Kindle Notes & Highlights
Because of feedback delays within complex systems, by the time a problem becomes apparent it may be unnecessarily difficult to solve.
-A stitch in time saves nine.
According to the competitive exclusion principle, if a reinforcing feedback loop rewards the winner of a competition with the means to win further competitions, the result will be the elimination of all but a few competitors.
For he that hath, to him shall be given; and he that hath not, from him shall be taken even that which he hath (Mark 4:25) or
-The rich get richer and the poor get poorer.
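The "success to the successful" loop above can be sketched as a toy simulation. The numbers and function below are illustrative, not from the book: whoever currently holds more resources wins each round's reward, so even a 2-point head start compounds into total dominance.

```python
# Toy "success to the successful" reinforcing loop (illustrative, not
# from the book): the current leader wins the round and collects the
# reward, which makes winning the next round more likely still.
def compete(a: float, b: float, reward: float = 10.0,
            rounds: int = 20) -> tuple[float, float]:
    for _ in range(rounds):
        if a >= b:           # current leader wins this round...
            a += reward      # ...and the prize widens the lead
        else:
            b += reward
    return a, b

final_a, final_b = compete(51.0, 49.0)  # a starts just 2 points ahead
print(final_a, final_b)                 # prints: 251.0 49.0
```

After one round the leader's advantage is locked in, and every subsequent reward goes the same way; breaking the loop requires an outside intervention, since nothing inside it favors the loser.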
A diverse system with multiple pathways and redundancies is more stable and less vulnerable to external shock than a uniform system with little diversity...
I have yet to see any problem, however complicated, which, when looked at in the right way, did not become still more complicated.
-Poul Anderson, science fiction writer
a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
You think that because you understand "one" that you must therefore understand "two" because one and one make two. But you forget that you must also understand "and."
Purposes are deduced from behavior, not from rhetoric or stated goals.
System purposes need not be human purposes and are not necessarily those intended by any single actor within the system.
You can understand the relative importance of a system's elements, interconnections, and purposes by imagining them changed one by one.
If A causes B, is it possible that B also causes A?
You'll stop looking for who's to blame; instead you'll start asking, "What's the system?" The concept of feedback opens up the idea that a system can cause its own behavior.
The ... goal of all theory is to make the ... basic elements as simple and as few as possible without having to surrender the adequate representation of ... experience.
-Albert Einstein, physicist
The information delivered by a feedback loop-even nonphysical feedback-can only affect future behavior; it can't deliver a signal fast enough to correct the behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system.
Systems with similar feedback structures produce similar dynamic behaviors.
One of the central insights of systems theory, as central as the observation that systems largely cause their own behavior, is that systems with similar feedback structures produce similar dynamic behaviors, even if the outward appearance of these systems is completely dissimilar.
A delay in a balancing feedback loop makes a system likely to oscillate.
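A minimal numerical sketch of this effect (parameters assumed for illustration, not from the book): a balancing loop steers a stock toward a goal, but each correction is based on a reading several steps old. With no delay the stock settles smoothly; with a delay it overshoots the goal and swings around it.

```python
# Balancing feedback loop with a perception delay (illustrative).
def simulate(delay: int, goal: float = 100.0, gain: float = 0.3,
             steps: int = 60) -> list[float]:
    stock = 0.0
    readings = [stock] * (delay + 1)        # buffer of past observations
    trajectory = []
    for _ in range(steps):
        perceived = readings[0]             # we act on the oldest reading
        stock += gain * (goal - perceived)  # corrective flow toward goal
        readings = readings[1:] + [stock]
        trajectory.append(stock)
    return trajectory

smooth = simulate(delay=0)  # approaches 100 and stays below it
wobbly = simulate(delay=3)  # overshoots 100, then oscillates around it
```

With a three-step delay the stock climbs past the goal before the loop "sees" the overshoot, then corrects too far the other way; shortening the delay or softening the gain damps the swings.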
Placing a system in a straitjacket of constancy can cause fragility to evolve.
-C. S. Holling, ecologist
Resilience is a measure of a system's ability to survive and persist within a variable environment. The opposite of resilience is brittleness or rigidity.
There are always limits to resilience.
Resilience is not the same thing as being static or constant over time. Resilient systems can be very dynamic. Short-term oscillations, or periodic outbreaks, or long cycles of succession, climax, and collapse may in fact be the normal condition, which resilience acts to restore!
Large organizations of all kinds, from corporations to governments, lose their resilience simply because the feedback mechanisms by which they sense and respond to their environment have to travel through too many layers of delay and distortion.
This capacity of a system to make its own structure more complex is called self-organization.
Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability.
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder. These conditions that encourage self-organization often can be scary for individuals and threatening to power structures.
Even complex forms of self-organization may arise from relatively simple organizing rules-or may not.
Complex systems can evolve from simple systems only if there are stable intermediate forms. The resulting complex forms will naturally be hierarchic.
Hierarchies are brilliant systems inventions, not only because they give a system stability and resilience, but also because they reduce the amount of information that any part of the system has to keep track of.
The original purpose of a hierarchy is always to help its originating subsystems do their jobs better. This is something, unfortunately, that both the higher and the lower levels of a greatly articulated hierarchy easily can forget. Therefore, many systems are not meeting our goals because of malfunctioning hierarchies.
When a subsystem's goals dominate at the expense of the total system's goals, the resulting behavior is called suboptimization.
enough central control to achieve coordination toward the large-system goal, and enough autonomy to keep all subsystems flourishing, functioning, and self-organizing.
The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head-my mental models. None of these is or ever will be the real world.
We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions.
Systems fool us by presenting themselves-or we fool ourselves by seeing the world-as a series of events.
When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system.
long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion-the questions we want to ask.
It's a great art to remember that boundaries are of our own making, and that they can and should be reconsidered for each new discussion, problem, or purpose.
At any given time, the input that is most important to a system is the one that is most limiting.
Any physical entity with multiple inputs and outputs is surrounded by layers of limits.
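This "law of the minimum" idea can be sketched in a few lines (the inputs and rates below are made up for illustration): whatever growth each input could support on its own, the system grows only as fast as its scarcest input allows.

```python
# Liebig-style limiting factor (illustrative numbers): each value is
# the growth rate that input alone could support; the smallest binds.
def growth_rate(inputs: dict[str, float]) -> float:
    return min(inputs.values())

plant = {"light": 0.9, "water": 0.2, "nitrogen": 0.7}
print(growth_rate(plant))         # prints: 0.2
print(min(plant, key=plant.get))  # prints: water -- the limiting input
```

Raising light or nitrogen changes nothing until water is relieved; then the next-smallest input becomes the binding constraint, which is exactly the "layers of limits" the passage above describes.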
There always will be limits to growth. They can be self-imposed. If they aren't, they will be system-imposed.
I realize with fright that my impatience for the re-establishment of democracy had something almost communist in it; or, more generally, something rationalist. I had wanted to make history move ahead in the same way that a child pulls on a plant to make it grow more quickly.
I believe we must learn to wait as we learn to create. We have to patiently sow the seeds, assiduously water the earth where they are sown and give the plants the time that is their own. One cannot fool a plant any more than one can fool history.
-Václav Havel, playwright, last President of Czechoslovakia and first President of the Czech Republic
Delays are ubiquitous in systems.
When there are long delays in feedback loops, some sort of foresight is essential.
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don't have perfect information, especially about more distant parts of the system.
we don't even make decisions that optimize our own individual good, much less the good of the system as a whole.