Kindle Notes & Highlights
Counterintuitive—that’s Forrester’s word to describe complex systems. Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.
Numbers, the sizes of flows, are dead last on my list of powerful interventions. Diddling with the details, arranging the deck chairs on the Titanic. Probably 90—no 95, no 99 percent—of our attention goes to parameters, but there’s not a lot of leverage in them. It’s not that parameters aren’t important—they can be, especially in the short term and to the individual who’s standing directly in the flow. People care deeply about such variables as taxes and the minimum wage, and so fight fierce battles over them. But changing these variables rarely changes the behavior of the national economy …
You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible. It reacts too slowly. And big buffers of some sorts, such as water reservoirs or inventories, cost a lot to build or maintain. Businesses invented just-in-time inventories, because occasional vulnerability to fluctuations or screw-ups is cheaper (for them, anyway) than certain, constant inventory costs—and because small-to-vanishing inventories allow more flexible response to shifting demand.
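A toy sketch of that trade-off (my own note, not from the book; every number here is invented): a small inventory buffer carries little stock but gets caught out by one surprise spike in demand, while a big buffer absorbs the same spike at the price of holding far more inventory all the time.

```python
# Toy inventory model (my own illustration, invented numbers).
# Steady demand of 10 units/step, one surprise spike, 3-step delivery delay.
def run(buffer_target, steps=40, delivery_delay=3):
    stock = buffer_target
    pipeline = [0] * delivery_delay      # orders placed but not yet delivered
    lost_sales = 0.0
    total_held = 0.0
    for t in range(steps):
        stock += pipeline.pop(0)         # today's delivery arrives
        demand = 60 if t == 20 else 10   # one surprise on top of steady demand
        sold = min(stock, demand)
        lost_sales += demand - sold
        stock -= sold
        # reorder up to the buffer target, counting what is already on the way
        pipeline.append(max(0, buffer_target - stock - sum(pipeline)))
        total_held += stock              # proxy for carrying cost
    return lost_sales, total_held / steps

for target in (30, 150):
    lost, held = run(buffer_target=target)
    print(f"buffer target {target:>3}: lost sales={lost:.0f}, avg stock held={held:.0f}")
```

With these made-up numbers the small buffer loses sales when the spike hits, and the large one never does, but it pays for that safety by carrying a big stock the whole time.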
Even with immense effort at forecasting, almost every electricity industry in the world experiences long oscillations between overcapacity and undercapacity. A system just can’t respond to short-term changes when it has long-term delays. That’s why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly.
It's not just the delay, though; the processing load at the center is also much higher if little processing of information happens at the edges. Even if there were no delay, there's no way for one person or committee to keep in its head the problems from every part of, say, a country.
Overlong delays in a system with a threshold, a danger point, a range past which irreversible damage can occur, cause overshoot and collapse.
It’s usually easier to slow down the change rate, so that inevitable feedback delays won’t cause so much trouble. That’s why growth rates are higher up on the leverage-point list than delay times.
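Another toy sketch (my own note, not from the book; numbers invented): a stock grows toward a limit, but the limiting feedback it reacts to is several steps out of date. With a fast growth rate the delayed signal arrives too late, so the stock overshoots the limit and crashes; with the same delay but a slower growth rate, it settles near the limit, which is the point about growth rates mattering more than delay times.

```python
# Toy model of a delayed limiting feedback (my own illustration, invented numbers).
def delayed_growth(r, delay, K=100.0, steps=60):
    history = [10.0] * (delay + 1)       # stock level over time, most recent last
    peak = history[-1]
    for _ in range(steps):
        current = history[-1]
        lagged = history[-1 - delay]     # the out-of-date signal the stock reacts to
        history.append(max(0.0, current + r * current * (1 - lagged / K)))
        peak = max(peak, history[-1])
    return peak, history[-1]

for r in (0.8, 0.2):
    peak, final = delayed_growth(r=r, delay=4)
    print(f"growth rate {r}: peak={peak:.0f} (limit 100), final={final:.0f}")
```

Same delay in both runs; only the growth rate changes. The fast run overshoots the limit several times over and collapses to zero, the slow run overshoots only slightly and settles close to the limit.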
One of the big mistakes we make is to strip away these “emergency” response mechanisms because they aren’t often used and they appear to be costly. In the short term, we see no effect from doing this. In the long term, we drastically narrow the range of conditions over which the system can survive.
the story of the electric meter in a Dutch housing development—in some of the houses the meter was installed in the basement; in others it was installed in the front hall. With no other differences in the houses, electricity consumption was 30 percent lower in the houses where the meter was in the highly visible location in the front hall. I love that story because it’s an example of a high leverage point in the information structure of the system. It’s not a parameter adjustment, not a strengthening or weakening of an existing feedback loop. It’s a new loop, delivering feedback to a place where it wasn’t going before.
The intervention point here is obvious, but unpopular. Encouraging variability and experimentation and diversity means “losing control.”
The goal of keeping the market competitive has to trump the goal of each individual corporation to eliminate its competitors, just as in ecosystems, the goal of keeping populations in balance and evolving has to trump the goal of each population to reproduce without limit.
That sets up an interesting question. These seem like similar situations, but for one it's easier to see how human agency in setting goals is relevant. For the other, an ecosystem can function entirely without humans, and has generally sustained competition and dynamic stability (allostasis).
Systems modelers say that we change paradigms by building a model of the system, which takes us outside the system and forces us to see it whole. I say that because my own paradigms have been changed that way.
People who are raised in the industrial world and who get enthused about systems thinking are likely to make a terrible mistake. They are likely to assume that here, in systems analysis, in interconnection and complication, in the power of the computer, here at last, is the key to prediction and control.
What was unique about our search was not our answers, or even our questions, but the fact that the tool of systems thinking, born out of engineering and mathematics, implemented in computers, drawn from a mechanistic mind-set and a quest for prediction and control, leads its practitioners, inexorably I believe, to confront the most deeply human mysteries. Systems thinking makes clear even to the most committed technocrat that getting along in this world of complex systems requires more than technocracy.
The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone. We can’t control systems or figure them out. But we can dance with them!
Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history. Ask people who’ve been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system—people’s memories are not always reliable when it comes to timing.
And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution.
Listen to any discussion, in your family or a committee meeting at work or among the pundits in the media, and watch people leap to solutions, usually solutions in “predict, control, or impose your will” mode, without having paid any attention to what the system is doing and why it’s doing it.
You don’t have to put forth your mental model with diagrams and equations, although doing so is a good practice. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be.
Our information streams are composed primarily of language. Our mental models are mostly verbal. Honoring information means above all avoiding language pollution—making the cleanest possible use we can of language.
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers. Because the pilot of a plane rides in the front of the plane, that pilot is intrinsically responsible. He or she will experience directly the consequences of his or her decisions.
Neither we ourselves, nor our associates, nor the publics that need to be involved … can learn what is going on and might go on if we act as if we really had the facts, were really certain about all the issues, knew exactly what the outcomes should/could be, and were really certain that we were attaining the most preferred outcomes.
When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool just to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term—the whole system.

