For other systems with this same structure of competing balancing loops, the fact that the stock goes on changing while you’re trying to control it can create real problems.
The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback.
This means that a flow can’t react instantly to a flow. It can react only to a change in a stock, and only after a slight delay to register the incoming information.
Many economic models make a mistake in this matter by assuming that consumption or production can respond immediately, say, to a change in price.
A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
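A minimal sketch, with made-up numbers of my own (a room losing heat to the outside, a heater reacting to a slightly delayed temperature reading), illustrates both points: the flow responds to the stock only after a lag, and a goal set exactly at the desired temperature leaves the stock falling short because of the drain.

```python
# A stock-maintaining balancing loop with a drain and a delayed reading.
# All numbers are illustrative.

def simulate(target=18.0, outside=10.0, leak=0.1, response=0.3, steps=40):
    temp = 10.0          # the stock: room temperature
    reading = temp       # the slightly delayed reading the heater reacts to
    for _ in range(steps):
        heat_in = response * max(target - reading, 0.0)   # flow reacts to the stock reading, not to the outflow
        heat_out = leak * (temp - outside)                 # draining process
        reading = temp                                     # this reading reaches the heater next step
        temp += heat_in - heat_out                         # the stock integrates its flows
    return temp

print(round(simulate(), 1))             # settles near 16: short of the 18-degree goal
print(round(simulate(target=21.0), 1))  # goal raised to offset the leak: settles near 18
```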
A Stock with One Reinforcing Loop and One Balancing Loop—Population and Industrial Economy
A population has a reinforcing loop causing it to grow through its birth rate, and a balancing loop causing it to die off through its death rate.
It grows exponentially or dies off, depending on whether its reinforcing feedback loop determining births is stronger than its balancing feedback loop determining deaths.
Dominance is an important concept in systems thinking. When one loop dominates another, it has a stronger impact on behavior. Because systems often have several competing feedback loops operating simultaneously, those loops that dominate the system will determine the behavior.
Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
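A minimal sketch of this one-stock, two-loop structure, with purely illustrative birth and death rates, shows how whichever loop is stronger dominates the behavior; if the rates shift over time, dominance shifts with them.

```python
# One stock, two loops, illustrative rates: births reinforce, deaths balance.

def project(population, birth_rate, death_rate, years):
    for _ in range(years):
        births = birth_rate * population    # reinforcing loop: more people, more births
        deaths = death_rate * population    # balancing loop: more people, more deaths
        population += births - deaths
    return population

print(round(project(1000, 0.03, 0.02, 50)))  # births dominate: grows exponentially
print(round(project(1000, 0.02, 0.03, 50)))  # deaths dominate: dies off
# If the birth rate falls over the years, the reinforcing loop weakens and
# dominance shifts to the death loop, changing the behavior mid-course.
```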
Dynamic systems studies usually are not designed to predict what will happen. Rather, they’re designed to explore what would happen if a number of driving factors unfold in a range of different ways.
QUESTIONS FOR TESTING THE VALUE OF A MODEL
Are the driving factors likely to unfold this way? If they did, would the system react this way? What is driving the driving factors?
Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.
In fact, just about any long-term model of a real economy should link together the two structures of population and capital to show how they affect each other.
One of the central insights of systems theory, as central as the observation that systems largely cause their own behavior, is that systems with similar feedback structures produce similar dynamic behaviors, even if the outward appearance of these systems is completely dissimilar.
A System with Delays—Business Inventory
This system is a version of the thermostat system—one balancing loop of sales draining the inventory stock and a competing balancing loop maintaining the inventory by resupplying what is lost in sales.
Although this system still consists of just two balancing loops, like the simplified thermostat system, it doesn’t behave like the thermostat system.
The inventory oscillates not because of any failing on the manager’s part, but because she is struggling to operate in a system in which she doesn’t have, and can’t have, timely information and in which physical delays prevent her actions from having an immediate effect on inventory.
A delay in a balancing feedback loop makes a system likely to oscillate.
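A small simulation makes the point concrete. This is a hedged sketch with made-up numbers, not the book’s model: sales drain an inventory stock, orders try to restore it, but the manager’s perception of sales adjusts only gradually and deliveries arrive only after a shipping delay.

```python
from collections import deque

def run(days=60, delivery_delay=5, perception_time=3.0, correction_time=8.0):
    inventory, desired = 200.0, 200.0
    sales, perceived_sales = 20.0, 20.0
    pipeline = deque([20.0] * delivery_delay)     # orders already on their way
    trace = []
    for day in range(days):
        if day == 10:
            sales = 25.0                          # a permanent step up in demand
        # information delay: belief about sales catches up only gradually
        perceived_sales += (sales - perceived_sales) / perception_time
        # replace perceived sales and close part of the inventory gap each day
        orders = max(perceived_sales + (desired - inventory) / correction_time, 0.0)
        pipeline.append(orders)
        deliveries = pipeline.popleft()           # physical delay before arrival
        inventory += deliveries - sales
        trace.append(round(inventory))
    return trace

# The inventory dips below the desired 200 units, then swings back past the
# target and oscillates before calming down; with these illustrative numbers,
# shortening correction_time (reacting harder) makes the swings larger, not smaller.
print(run()[:40])
```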
This perverse kind of result can be seen all the time—someone trying to fix a system is attracted intuitively to a policy lever that in fact does have a strong effect on the system. And then the well-intentioned fixer pulls the lever in the wrong direction! This is just one example of how we can be surprised by the counterintuitive behavior of systems when we start trying to change them.
Changing the delays in a system can make it much easier or much harder to manage. You can see why systems thinkers are somewhat fanatic on the subject of delays. We’re always on the alert to see where delays occur in systems, how long they are, and whether they are delays in information streams or in physical processes.
That very large system, with interconnected industries responding to each other through delays, entraining each other in their oscillations, and being amplified by multipliers and speculators, is the primary cause of business cycles.
Economies are extremely complex systems; they are full of balancing feedback loops with delays, and they are inherently oscillatory.
A Renewable Stock Constrained by a Nonrenewable Stock—an Oil Economy
Growth in a constrained environment is very common, so common that systems thinkers call it the “limits-to-growth” archetype.
In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.
Like resources that supply the inflows to a stock, a pollution constraint can be renewable or nonrenewable. It’s nonrenewable if the environment has no capacity to absorb the pollutant or make it harmless. It’s renewable if the environment has a finite, usually variable, capacity for removal.
Whether the constraining balancing loops originate from a renewable or nonrenewable resource makes some difference, not in whether growth can continue forever, but in how growth is likely to end.
A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.
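A quick calculation shows how short. The sketch below assumes, purely for illustration, a resource that would last 100 years at the current extraction rate, and lets extraction grow by a few percent per year.

```python
# A nonrenewable stock drained by exponentially growing extraction.
# All numbers are illustrative.

def years_until_depleted(reserve_years=100.0, growth=0.05):
    reserve = reserve_years      # stock measured in "years of current use"
    extraction = 1.0             # this year's use, in the same units
    years = 0
    while reserve > 0:
        reserve -= extraction
        extraction *= 1 + growth # exponential growth in use
        years += 1
    return years

print(years_until_depleted(growth=0.00))   # 100 years with no growth
print(years_until_depleted(growth=0.05))   # about 37 years at 5% growth
print(years_until_depleted(growth=0.07))   # about 31 years at 7% growth
```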
Well before depletion makes capital less efficient in one place, companies shift investment to discovery and development of another deposit somewhere else. But if there are local limits, will there eventually be global ones?
A Renewable Stock Constrained by a Renewable Stock—a Fishing Economy
The regeneration rate of the fish is not constant, but is dependent on the number of fish in the area—fish density.
Renewable resources are flow-limited. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.
I’ve shown three sets of possible behaviors of this renewable resource system here: overshoot and adjustment to a sustainable equilibrium, overshoot beyond that equilibrium followed by oscillation around it, and overshoot followed by collapse of the resource and the industry dependent on the resource.
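A rough sketch of that structure, with illustrative parameters rather than the book’s model, reproduces two of these behaviors: fish regenerate at a density-dependent rate, the fleet grows while fishing is profitable and shrinks when it is not, and the assumed shape of the catch-per-boat curve decides which behavior appears.

```python
# Fishery sketch: logistic regeneration, a fleet that invests when fishing
# is profitable, and a catch-per-boat curve whose steepness is the key assumption.

def fishery(half_saturation, steps=150, invest=0.05):
    fish, capacity, boats = 900.0, 1000.0, 4.0
    trace = []
    for _ in range(steps):
        regen = 0.5 * fish * (1 - fish / capacity)          # density-dependent regeneration
        per_boat = 25.0 * fish / (fish + half_saturation)   # catch per boat falls with scarcity
        catch = min(boats * per_boat, fish)                 # cannot catch more fish than exist
        fish += regen - catch
        # the fleet grows while a boat's catch covers its (assumed) cost of 15
        boats = max(boats * ((1 + invest) if per_boat > 15.0 else (1 - invest)), 0.5)
        trace.append((round(fish), round(boats, 1)))
    return trace

# half_saturation=500: catch per boat drops quickly as fish thin out, the fleet
# is limited early, and the system oscillates around a sustainable level.
# half_saturation=50: boats stay profitable even when fish are scarce, the fleet
# keeps growing, and the stock is driven down toward zero: overshoot and
# collapse of both the resource and the fleet.
for h in (500, 50):
    print(h, fishery(h)[::25])
```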
The trick, as with all the behavioral possibilities of complex systems, is to recognize what structures contain which latent behaviors, and what conditions release those behaviors—and, where possible, to arrange the structures and conditions to reduce the probability of destructive behaviors and to encourage the possibility of beneficial ones.
Resilience is a measure of a system’s ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.
A single balancing loop brings a system stock back to its desired state. Resilience is provided by several such loops, operating through different mechanisms, at different time scales, and with redundancy—one kicking in if another one fails.
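A small sketch with hypothetical numbers makes the redundancy point concrete: a stock maintained by a fast balancing loop and a slow backup loop still recovers when the fast loop fails, though more slowly, and not at all when both fail.

```python
# Two redundant balancing loops on different time scales restoring one stock.
# All numbers are illustrative.

def restore(fast_on=True, slow_on=True, steps=60, target=100.0):
    stock = 60.0                                  # start displaced from the target
    for _ in range(steps):
        gap = target - stock
        fast = 0.3 * gap if fast_on else 0.0      # quick, strong correction
        slow = 0.05 * gap if slow_on else 0.0     # sluggish backup mechanism
        stock += fast + slow
    return round(stock, 1)

print(restore(True, True))     # restored quickly
print(restore(False, True))    # fast loop fails; the backup restores it slowly
print(restore(False, False))   # no loops left: the stock stays displaced
```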
Resilience is not the same thing as being static or constant over time. Resilient systems can be very dynamic. Short-term oscillations, or periodic outbreaks, or long cycles of succession, climax, and collapse may in fact be the normal condition, which resilience acts to restore!
Static stability is something you can see; it’s measured by variation in the condition of a system week by week or year by year.
Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.
Many chronic diseases, such as cancer and heart disease, come from breakdown of resilience mechanisms that repair DNA, keep blood vessels flexible, or control cell division.
Loss of resilience can come as a surprise, because the system usually is paying much more attention to its play than to its playing space. One day it does something it has done a hundred times before and crashes.
Awareness of resilience enables one to see many ways to preserve or enhance a system’s own restorative powers.
This capacity of a system to make its own structure more complex is called self-organization.
Self-organization is such a common property, particularly of living systems, that we take it for granted.
And if we weren’t nearly blind to the property of self-organization, we would do better at encouraging, rather than destroying, the self-organizing capacities of the systems of which we are a part.
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder.
Fortunately, self-organization is such a basic property of living systems that even the most overbearing power structure can never fully kill it, although in the name of law and order, self-organization can be suppressed for long, barren, cruel, boring periods.
Systems theorists used to think that self-organization was such a complex property of systems that it could never be understood.