Kindle Notes & Highlights
To shift attention from the abundant factors to the next potential limiting factor is to gain real understanding of, and control over, the growth process.
Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
Delays are ubiquitous in systems.
What is a significant delay depends—usually—on which set of frequencies you’re trying to understand.
Delays determine how fast systems can react, how accurately they hit their targets, and how timely is the information passed around a system. Overshoots, oscillations, and collapses are always caused by delays.
When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.
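The effect of a delay in a balancing loop can be sketched in a few lines of Python. This is a toy model, not from the book; the goal, gain, and delay values are arbitrary assumptions chosen only to make the behavior visible.

```python
# A toy balancing loop: a stock is pushed toward a goal of 100, but the
# correction is computed from the stock's level as it was `delay` steps
# ago. All parameter values are illustrative assumptions.

def simulate(delay, goal=100.0, gain=0.5, steps=40):
    history = [0.0]                      # stock starts empty
    for _ in range(steps):
        seen = history[max(0, len(history) - 1 - delay)]  # stale reading
        history.append(history[-1] + gain * (goal - seen))
    return history

no_delay = simulate(delay=0)
long_delay = simulate(delay=4)
# With no delay the stock climbs smoothly toward the goal; with a delay
# of 4 steps the same policy overshoots and oscillates around it.
```

The policy is identical in both runs; only the age of the information changes, which is exactly why overshoot and oscillation point back to delays.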
Bounded rationality means that people make quite reasonable decisions based on the information they have.
The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.
The primary symptom of a balancing feedback loop structure is that not much changes, despite outside forces pushing the system.
This is the systemic trap of “fixes that fail” or “policy resistance.”
Such resistance to change arises when goals of subsystems are different from and inconsistent with each other.
In a policy-resistant system with actors pulling in different directions, everyone has to put great effort into keeping the system where no one wants it to be.
THE TRAP: POLICY RESISTANCE When various actors try to pull a system stock toward various goals, the result can be policy resistance.
There are three ways to avoid the tragedy of the commons:
- Educate and exhort. Help people to see the consequences of unrestrained use of the commons. Appeal to their morality. Persuade them to be temperate. Threaten transgressors with social disapproval or eternal hellfire.
- Privatize the commons. Divide it up, so that each person reaps the consequences of his or her own actions. If some people lack the self-control to stay below the carrying capacity of their own private resource, those people will harm only themselves and not others.
- Regulate the commons. Garrett Hardin calls this option…
THE TRAP: TRAGEDY OF THE COMMONS When there is a commonly shared resource, every user benefits directly from its use, but shares the costs of its abuse with everyone else. Therefore, there is very weak feedback from the condition of the resource to the decisions of the resource users. The consequence is overuse of the resource, eroding it until it becomes unavailable to anyone.
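The weak feedback from resource condition to user decisions can be sketched as a toy simulation. All the numbers below are illustrative assumptions, not from the book.

```python
# A toy commons: a logistically regrowing resource shared by five users.
# Each user's harvest is a private gain while depletion is shared, so
# heavy use erodes the stock until nothing is left for anyone.

def run_commons(harvest_per_user, users=5, regen_rate=0.1,
                stock=100.0, capacity=100.0, steps=50):
    for step in range(steps):
        stock += regen_rate * stock * (1 - stock / capacity)  # regrowth
        stock -= min(stock, users * harvest_per_user)         # total harvest
        if stock <= 0:
            return step        # resource collapsed at this step
    return steps               # resource survived the whole run

collapse_step = run_commons(harvest_per_user=12.0)   # overuse
sustained_steps = run_commons(harvest_per_user=0.4)  # modest use
```

When total harvest exceeds what regrowth can replace, the stock erodes to zero within a few steps; below that threshold it settles at a sustainable level.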
There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst.
THE TRAP: DRIFT TO LOW PERFORMANCE Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.
THE WAY OUT Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift toward high performance!
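Both the trap and the way out can be sketched in one small model. The parameters here (a 10% perception bias, a 0.5 adjustment rate) are illustrative assumptions, not from the book.

```python
# Toy model of eroding goals: performance tracks the goal, perception of
# performance carries a 10% negative bias, and the goal drifts toward
# perceived performance -- unless it is anchored to the best performance
# seen so far.

def run(bias=0.1, use_best=False, steps=40):
    goal = perf = best = 100.0
    for _ in range(steps):
        perf += 0.5 * (goal - perf)            # system works toward its goal
        perceived = (1 - bias) * perf          # pessimistic perception
        best = max(best, perf)
        if use_best:
            goal = best                        # way out: ratchet on the best
        else:
            goal += 0.5 * (perceived - goal)   # goal erodes toward perception
    return perf

eroded = run(use_best=False)
held = run(use_best=True)
```

With the eroding rule, goal and performance chase each other downward; anchoring the goal to the best past performance holds the system at its standard.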
Escalation, being a reinforcing feedback loop, builds exponentially.
THE TRAP: ESCALATION When the state of one stock is determined by trying to surpass the state of another stock—and vice versa—then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse—because exponential growth cannot go on forever.
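The exponential character of escalation is easy to see in miniature. The 10% margin below is an arbitrary illustrative assumption.

```python
# A two-stock escalation loop: each side sets its level 10% above the
# other's, so both stocks grow exponentially.

def escalate(margin=0.10, rounds=20):
    a = b = 1.0
    levels = []
    for _ in range(rounds):
        a = (1 + margin) * b   # A tries to surpass B
        b = (1 + margin) * a   # B responds in kind
        levels.append(b)
    return levels

levels = escalate()
# Each round multiplies both stocks by (1 + margin)**2 -- a constant
# growth factor, i.e. exponential escalation.
```

Even a modest 10% one-upmanship compounds to more than a 40-fold increase in twenty rounds, which is why escalation reaches extremes surprisingly quickly.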
THE TRAP: SUCCESS TO THE SUCCESSFUL If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.
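A minimal sketch of this loop, under an assumed growth rule (not from the book): each competitor grows in proportion to its current share of the total.

```python
# "Success to the successful" in miniature: reward is proportional to
# current share, so a 51/49 split compounds toward winner-take-all,
# while a perfectly even split never moves.

def compete(share=0.51, growth=0.2, rounds=200):
    a, b = share, 1.0 - share
    for _ in range(rounds):
        total = a + b
        a *= 1 + growth * a / total   # the bigger you are...
        b *= 1 + growth * b / total   # ...the faster you grow
    return a / (a + b)

slight_edge = compete(0.51)   # ends near 1.0
dead_even = compete(0.50)     # a perfectly even split stays even
```

The initial 2-point edge is tiny, but because it feeds the growth rate itself, the reinforcing loop converts it into near-total dominance.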
One definition of addiction used in Alcoholics Anonymous is repeating the same stupid behavior over and over and over, and somehow expecting different results.
If you are the one with an unsupportable dependency, build your system’s own capabilities back up before removing the intervention. Do it right away. The longer you wait, the harder the withdrawal process will be.
THE TRAP: SHIFTING THE BURDEN TO THE INTERVENOR Shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces (or disguises) the symptoms, but does nothing to solve the underlying problem. Whether it is a substance that dulls one’s perception or a policy that hides the underlying trouble, the drug of choice interferes with the actions that could solve the real problem. If the intervention designed to correct the problem causes the self-maintaining capacity of the original system to atrophy or erode, then a destructive reinforcing feedback loop is set in…
THE TRAP: RULE BEATING Rules to govern a system can lead to rule beating—perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.
If you define the goal of a society as GNP, that society will do its best to produce GNP. It will not produce welfare, equity, justice, or efficiency unless you define a goal and regularly measure and report the state of welfare, equity, justice, or efficiency.
THE TRAP: SEEKING THE WRONG GOAL System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.
Counterintuitive—that’s Forrester’s word to describe complex systems. Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.
It’s not that parameters aren’t important—they can be, especially in the short term and to the individual who’s standing directly in the flow. People care deeply about such variables as taxes and the minimum wage, and so fight fierce battles over them. But changing these variables rarely changes the behavior of the national economy system.
System goals are parameters that can make big differences.
In chemistry and other fields, a big, stabilizing stock is known as a buffer.
You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible. It reacts too slowly.
Physical structure is crucial in a system, but is rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place.
Delays in feedback loops are critical determinants of system behavior. They are common causes of oscillations.
A system just can’t respond to short-term changes when it has long-term delays.
A delay in a feedback process is critical relative to rates of change in the stocks that the feedback loop is trying to control.
I would list delay length as a high leverage point, except for the fact that delays are not often easily changeable.
The strength of a balancing loop—its ability to keep its appointed stock at or near its goal—depends on the combination of all its parameters and links—the accuracy and rapidity of monitoring, the quickness and power of response, the directness and size of corrective flows.
The strength of a balancing feedback loop is important relative to the impact it is designed to correct.
Examples of strengthening balancing feedback controls to improve a system’s self-correcting abilities include:
- preventive medicine, exercise, and good nutrition to bolster the body’s ability to fight disease,
- integrated pest management to encourage natural predators of crop pests,
- the Freedom of Information Act to reduce government secrecy,
- monitoring systems to report on environmental damage,
- protection for whistleblowers, and
- impact fees, pollution taxes, and performance bonds to recapture the externalized public costs of private benefits.
Reinforcing feedback loops are sources of growth, explosion, erosion, and collapse in systems.
Reducing the gain around a reinforcing loop—slowing the growth—is usually a more powerful leverage point in systems than strengthening balancing loops, and far preferable to letting the reinforcing loop run.
Population and economic growth rates in the World model are leverage points, because slowing them gives the many balancing loops, through technology and markets and other forms of adaptation (all of which have limits and delays), time to function. It’s the same as slowing the car when you’re driving too fast, rather than calling for more responsive brakes or technical advances in steering.
Look for leverage points around birth rates, interest rates, erosion rates, “success to the successful” loops, any place where the more you have of something, the more you have the possibility of having more.
A missing information flow is one of the most common causes of system malfunction.
The rules of the system define its scope, its boundaries, its degrees of freedom.
Power over the rules is real power.
If you want to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them.
The ability to self-organize is the strongest form of system resilience.
Insistence on a single culture shuts down learning and cuts back resilience.