Thinking In Systems Quotes

Thinking In Systems: A Primer by Donella H. Meadows
21,378 ratings, 4.19 average rating, 1,999 reviews
Showing 151-180 of 302
“Stocks generally change slowly, even when the flows into or out of them change suddenly. Therefore, stocks act as delays or buffers or shock absorbers in systems.”
Donella H. Meadows, Thinking in Systems: A Primer
“The tub can’t fill up immediately, even with the inflow faucet on full blast. A stock takes time to change, because flows take time to flow. That’s a vital point, a key to understanding why systems behave as they do. Stocks usually change slowly. They can act as delays, lags, buffers, ballast, and sources of momentum in a system. Stocks, especially large ones, respond to change, even sudden change, only by gradual filling or emptying.”
Donella H. Meadows, Thinking in Systems: A Primer
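Meadows's bathtub translates directly into code. A minimal stock-and-flow sketch (the function and all figures are invented for illustration, not from the book): even with the inflow faucet "on full blast," the stock can only rise one flow-step at a time.

```python
def simulate_stock(initial, inflow, outflow, steps):
    """Accumulate a single stock: each time step, stock += inflow - outflow."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow
        history.append(stock)
    return history

# Faucet wide open (5 liters/step), drain closed: the tub still needs
# 10 steps to hold 50 liters -- a stock takes time to change.
levels = simulate_stock(initial=0.0, inflow=5.0, outflow=0.0, steps=10)
```

Sudden changes in the flows show up in the stock only as a change of slope, which is why stocks act as buffers and shock absorbers.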
“The human mind seems to focus more easily on stocks than on flows. On top of that, when we do focus on flows, we tend to focus on inflows more easily than on outflows. Therefore, we sometimes miss seeing that we can fill a bathtub not only by increasing the inflow rate, but also by decreasing the outflow rate. Everyone understands that you can prolong the life of an oil-based economy by discovering new oil deposits. It seems to be harder to understand that the same result can be achieved by burning less oil.”
Donella H. Meadows, Thinking in Systems: A Primer
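The oil example can be checked with arithmetic (all figures hypothetical): what determines the lifetime of a finite stock is the net flow, so raising the inflow (new discoveries) and cutting the outflow (burning less) are interchangeable levers.

```python
def years_until_empty(stock, inflow, outflow):
    """Count steps until a finite stock is drawn down to zero (assumes outflow > inflow)."""
    years = 0
    while stock > 0:
        stock += inflow - outflow
        years += 1
    return years

base      = years_until_empty(1000, inflow=0,  outflow=50)  # 20 years
discovery = years_until_empty(1000, inflow=10, outflow=50)  # 25 years: find more oil
conserve  = years_until_empty(1000, inflow=0,  outflow=40)  # 25 years: burn less oil
```

Both interventions buy the same five extra years, which is the point we "miss seeing" when we fixate on inflows.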
“If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems.”
Donella H. Meadows, Thinking in Systems: A Primer
“Stocks change over time through the actions of a flow. Flows are filling and draining, births and deaths, purchases and sales, growth and decay, deposits and withdrawals, successes and failures. A stock, then, is the present memory of the history of changing flows within the system.”
Donella H. Meadows, Thinking in Systems: A Primer
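The "memory" image is literal: a stock's level at any moment is its initial level plus every net flow that has ever passed through it. A small sketch using a bank balance (figures invented):

```python
def stock_history(initial, net_flows):
    """Replay a sequence of net flows; the stock at each moment sums its whole history."""
    levels, stock = [], initial
    for flow in net_flows:
        stock += flow
        levels.append(stock)
    return levels

# Deposits and withdrawals: the balance "remembers" all of them.
balance = stock_history(100, [50, -30, 20, -10])  # -> [150, 120, 140, 130]
```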
“A stock is the memory of the history of changing flows within the system.”
Donella H. Meadows, Thinking in Systems: A Primer
“A stock is the foundation of any system. Stocks are the elements of the system that you can see, feel, count, or measure at any given time. A system stock is just what it sounds like: a store, a quantity, an accumulation of material or information that has built up over time. It may be the water in a bathtub, a population, the books in a bookstore, the wood in a tree, the money in a bank, your own self-confidence.”
Donella H. Meadows, Thinking in Systems: A Primer
“To ask whether elements, interconnections, or purposes are most important in a system is to ask an unsystemic question. All are essential. All interact. All have their roles. But the least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior. Interconnections are also critically important. Changing relationships usually changes system behavior. The elements, the parts of systems we are most likely to notice, are often (not always) least important in defining the unique characteristics of the system—unless changing an element also results in changing relationships or purpose.”
Donella H. Meadows, Thinking in Systems: A Primer
“If the interconnections change, the system may be greatly altered. It may even become unrecognizable, even though the same players are on the team. Change the rules from those of football to those of basketball, and you’ve got, as they say, a whole new ball game.”
Donella H. Meadows, Thinking in Systems: A Primer
“Changing elements usually has the least effect on the system. If you change all the players on a football team, it is still recognizably a football team. (It may play much better or much worse—particular elements in a system can indeed be important.) A tree changes its cells constantly, its leaves every year or so, but it is still essentially the same tree. Your body replaces most of its cells every few weeks, but it goes on being your body. The university has a constant flow of students and a slower flow of professors and administrators, but it is still a university. In fact it is still the same university, distinct in subtle ways from others, just as General Motors and the U.S. Congress somehow maintain their identities even though all their members change. A system generally goes on being itself, changing only slowly if at all, even with complete substitutions of its elements—as long as its interconnections and purposes remain intact.”
Donella H. Meadows, Thinking in Systems: A Primer
“The word function is generally used for a nonhuman system, the word purpose for a human one, but the distinction is not absolute, since so many systems have both human and nonhuman elements.”
Donella H. Meadows, Thinking in Systems: A Primer
“Systems can change, adapt, respond to events, seek goals, mend injuries, and attend to their own survival in lifelike ways, although they may contain or consist of nonliving things. Systems can be self-organizing, and often are self-repairing over at least some range of disruptions. They are resilient, and many of them are evolutionary. Out of one system other completely new, never-before-imagined systems can arise.”
Donella H. Meadows, Thinking in Systems: A Primer
“When a living creature dies, it loses its “system-ness.” The multiple interrelations that held it together no longer function, and it dissipates, although its material remains part of a larger food-web system.”
Donella H. Meadows, Thinking in Systems: A Primer
“Is there anything that is not a system? Yes—a conglomeration without any particular interconnections or function. Sand scattered on a road by happenstance is not, itself, a system. You can add sand or take away sand and you still have just sand on the road. Arbitrarily add or take away football players, or pieces of your digestive system, and you quickly no longer have the same system.”
Donella H. Meadows, Thinking in Systems: A Primer
“Modern systems theory, bound up with computers and equations, hides the fact that it traffics in truths known at some level by everyone.”
Donella H. Meadows, Thinking in Systems: A Primer
“An outside event may unleash that behavior, but the same outside event applied to a different system is likely to produce a different result.”
Donella H. Meadows, Thinking in Systems: A Primer
“In a strict systems sense, there is no long-term, short-term distinction. Phenomena at different time-scales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.”
Donella H. Meadows, Thinking in Systems: A Primer
“‘Intrinsic responsibility’ means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers. Because the pilot of a plane rides in the front of the plane, that pilot is intrinsically responsible. He or she will experience directly the consequences of his or her decisions.”
Donella H. Meadows, Thinking in Systems: A Primer
“Information is power. Anyone interested in power grasps that idea very quickly. The media, the public relations people, the politicians, and advertisers who regulate much of the public flow of information have far more power than most people realize. They filter and channel information. Often they do so for short-term, self-interested purposes. It’s no wonder that our social systems so often run amok.”
Donella H. Meadows, Thinking in Systems: A Primer
“There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops—and why this kind of leverage point is so often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen (or go around them and make it happen anyway).”
Donella H. Meadows, Thinking in Systems: A Primer
“Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure. The tragedy of the commons that is crashing the world’s commercial fisheries occurs because there is little feedback from the state of the fish population to the decision to invest in fishing vessels. Contrary to economic opinion, the price of fish doesn’t provide that feedback. As the fish get more scarce they become more expensive, and it becomes all the more profitable to go out and catch the last few. That’s a perverse feedback, a reinforcing loop that leads to collapse. It is not price information but population information that is needed.”
Donella H. Meadows, Thinking in Systems: A Primer
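The fisheries loop can be caricatured in a few lines (parameters invented; simple logistic regrowth stands in for fish biology). When the harvest stays high because rising prices keep fishing profitable regardless of scarcity, the stock is driven to collapse; when the harvest rule is fed by the population itself, the stock persists.

```python
def fish_stock_after(years, harvest_rule, stock=1000.0, growth=0.2, capacity=1000.0):
    """Iterate a fish stock: harvest, then logistic regrowth on the pre-harvest stock."""
    for _ in range(years):
        catch = min(stock, harvest_rule(stock))
        stock = stock - catch + growth * stock * (1 - stock / capacity)
    return stock

price_feedback      = lambda s: 150.0    # scarcity raises price, so effort never falls
population_feedback = lambda s: 0.1 * s  # harvest quota tied to the fish population

collapsed = fish_stock_after(30, price_feedback)       # driven essentially to zero
sustained = fish_stock_after(30, population_feedback)  # settles at a level above 500
```

The first rule is the "perverse feedback" of the quote: the information reaching the decision makers (price) is the wrong signal, and wiring in the right one (population) changes the outcome.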
“Another of Forrester’s classics was his study of urban dynamics, published in 1969, which demonstrated that subsidized low-income housing is a leverage point. The less of it there is, the better off the city is—even the low-income folks in the city. This model came out at a time when national policy dictated massive low-income housing projects, and Forrester was derided. Since then, many of those projects have been torn down in city after city. Counterintuitive—that’s Forrester’s word to describe complex systems. Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.”
Donella H. Meadows, Thinking in Systems: A Primer
“This idea of leverage points is not unique to systems analysis—it’s embedded in legend: the silver bullet; the trimtab; the miracle cure; the secret passage; the magic password; the single hero who turns the tide of history; the nearly effortless way to cut through or leap over huge obstacles. We not only want to believe that there are leverage points, we want to know where they are and how to get our hands on them. Leverage points are points of power.”
Donella H. Meadows, Thinking in Systems: A Primer
“If the desired system state is good education, measuring that goal by the amount of money spent per student will ensure money spent per student. If the quality of education is measured by performance on standardized tests, the system will produce performance on standardized tests. Whether either of these measures is correlated with good education is at least worth thinking about.”
Donella H. Meadows, Thinking in Systems: A Primer
“Markets tend toward monopoly and ecological niches toward monotony, but they also create offshoots of diversity, new markets, new species, which in the course of time may attract competitors, which then begin to move the system toward competitive exclusion again.”
Donella H. Meadows, Thinking in Systems: A Primer
“There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst. If perceived performance has an upbeat bias instead of a downbeat one, if one takes the best results as a standard, and the worst results only as a temporary setback, then the same system structure can pull the system up to better and better performance.”
Donella H. Meadows, Thinking in Systems: A Primer
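The first antidote shows up even in a toy model (all dynamics invented): performance chases the goal but suffers a constant setback each period. If the goal drifts toward perceived performance, the system ratchets downward; if the standard is held absolute, performance stabilizes just below it. (The second antidote, keying goals to the best past performance, needs variability in results to show its upward ratchet, so only the first is sketched.)

```python
def run(adjust_goal, goal=100.0, performance=90.0, years=20):
    """One eroding-goals experiment: effort closes half the goal gap, then erosion bites."""
    for _ in range(years):
        performance += 0.5 * (goal - performance)  # effort proportional to perceived gap
        performance -= 2.0                         # constant erosion / bad breaks
        goal = adjust_goal(goal, performance)
    return performance

eroding  = run(lambda goal, perf: 0.5 * goal + 0.5 * perf)  # standard drifts toward results
absolute = run(lambda goal, perf: goal)                     # standard held absolute
# eroding sinks to roughly 69; absolute stabilizes just under 96.
```

Same structure, opposite destinies: the only difference is what the goal is allowed to learn from.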
“The trap called the tragedy of the commons comes about when there is escalation, or just simple growth, in a commonly shared, erodable environment. Ecologist Garrett Hardin described the commons system in a classic article in 1968.”
Donella H. Meadows, Thinking in Systems: A Primer
“Harmonization of goals in a system is not always possible, but it’s an option worth looking for. It can be found only by letting go of more narrow goals and considering the long-term welfare of the entire system.”
Donella H. Meadows, Thinking in Systems: A Primer
“The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing. The most familiar examples of this harmonization of goals are mobilizations of economies during wartime, or recovery after war or natural disaster.”
Donella H. Meadows, Thinking in Systems: A Primer