Thinking in Systems: A Primer
Read between May 6 - June 17, 2025
5%
The system, to a large extent, causes its own behavior! An outside event may unleash that behavior, but the same outside event applied to a different system is likely to produce a different result.
5%
A diverse system with multiple pathways and redundancies is more stable and less vulnerable to external shock than a uniform system with little diversity. — Don’t put all your eggs in one basket.
7%
Systems thinkers call these common structures that produce characteristic behaviors “archetypes.”
7%
The behavior of a system cannot be known just by knowing the elements of which the system is made.
8%
I have yet to see any problem, however complicated, which, when looked at in the right way, did not become still more complicated. —POUL ANDERSON
8%
A system is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
8%
A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.
9%
Many of the interconnections in systems operate through the flow of information. Information holds systems together and plays a great role in determining how they operate.
10%
Purposes are deduced from behavior, not from rhetoric or stated goals.
10%
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
16%
Balancing feedback loops are goal-seeking or stability-seeking.
16%
Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
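The goal-seeking behavior of a balancing loop can be sketched as a few lines of simulation. The thermostat-style setup, names, and constants below are illustrative assumptions, not from the book:

```python
# A minimal balancing (goal-seeking) feedback loop: each step closes a
# fixed fraction of the gap between the stock and its goal, so the
# stock converges on the goal and resists being pushed away from it.
GOAL = 20.0        # target temperature (illustrative units)
RATE = 0.3         # fraction of the gap closed each step

temp = 5.0
history = []
for _ in range(30):
    gap = GOAL - temp       # the discrepancy drives the correction
    temp += RATE * gap      # flow proportional to the gap
    history.append(temp)
# temp rises smoothly toward GOAL and then holds there -- the same
# structure that gives stability also resists any push away from it.
```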
18%
The … goal of all theory is to make the … basic elements as simple and as few as possible without having to surrender the adequate representation of … experience. —Albert Einstein
24%
Systems with similar feedback structures produce similar dynamic behaviors.
26%
A delay in a balancing feedback loop makes a system likely to oscillate.
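The effect of the delay can be seen by modifying a goal-seeking loop so the correction responds to a stale reading of the stock. The constants here are illustrative, not from the book:

```python
# The same goal-seeking loop, but the correction uses a perception of
# the stock that is several steps old. The loop keeps correcting for a
# gap that has already closed, so it overshoots and oscillates.
GOAL = 20.0
RATE = 0.25
DELAY = 4                      # steps of perception delay

temp = 5.0
readings = [temp] * DELAY      # queue of delayed measurements
history = []
for _ in range(60):
    perceived = readings[0]            # the stock as it looked DELAY steps ago
    temp += RATE * (GOAL - perceived)
    readings = readings[1:] + [temp]
    history.append(temp)
# With DELAY = 0 the approach is monotone; with DELAY = 4 the stock
# repeatedly crosses above and below the goal before settling.
```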
33%
Placing a system in a straitjacket of constancy can cause fragility to evolve. —C. S. Holling, ecologist
34%
The most marvelous characteristic of some complex systems is their ability to learn, diversify, complexify, evolve.
36%
Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify. Even complex forms of self-organization may arise from relatively simple organizing rules—or may not.
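A classic illustration of complex structure arising from simple rules (not an example from the book) is an elementary cellular automaton such as Wolfram's rule 30, where an eight-entry lookup table generates an intricate, seemingly disordered pattern from a single seed:

```python
# Rule 30: each cell's next state is a fixed function of itself and its
# two neighbors. Eight cases, one output bit each -- yet the pattern
# that grows from one seed cell is intricate and aperiodic.
RULE = 30
table = {tuple(int(b) for b in f"{i:03b}"): (RULE >> i) & 1 for i in range(8)}

width, steps = 31, 15
row = [0] * width
row[width // 2] = 1            # single live seed cell

rows = [row]
for _ in range(steps):
    row = [table[(row[i - 1], row[i], row[(i + 1) % width])]
           for i in range(width)]                  # wrap-around edges
    rows.append(row)

for r in rows:                 # print the space-time diagram
    print("".join("#" if c else "." for c in r))
```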
39%
When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system. That’s because long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.
39%
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
40%
Systems thinking goes back and forth constantly between structure (diagrams of stocks, flows, and feedback) and behavior (time graphs).
45%
At any given time, the input that is most important to a system is the one that is most limiting.
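This is Liebig's "law of the minimum." A toy model (inputs and numbers invented for illustration) makes the point concrete:

```python
# Growth is set by the scarcest input: adding more of a non-limiting
# input changes nothing, while relieving the current limit helps only
# until some other input becomes the new limit.
supplies = {"nitrogen": 0.4, "water": 1.0, "light": 0.9}  # fraction of need met

def growth(inputs):
    return min(inputs.values())        # the most limiting input governs

base = growth(supplies)                                # limited by nitrogen
more_water = growth({**supplies, "water": 2.0})        # no change at all
more_nitrogen = growth({**supplies, "nitrogen": 2.0})  # light is the new limit
```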
48%
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system. Fishermen don’t know how many fish there are, much less how many fish will be caught by other fishermen that same day.
49%
Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires.
50%
Being less surprised by complex systems is mainly a matter of learning to expect, appreciate, and use the world’s complexity.
52%
The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing.
56%
Drift to low performance is a gradual process. If the system state plunged quickly, there would be an agitated corrective process. But if it drifts down slowly enough to erase the memory of (or belief in) how much better things used to be, everyone is lulled into lower and lower expectations, lower effort, lower performance.
56%
There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst. If perceived performance has an upbeat bias instead of a downbeat one, if one takes the best results as a standard, and the worst results only as a temporary setback, then the same system structure can pull the system up to better and better performance.
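The two dynamics can be contrasted in a small simulation. The update rules and constants below are illustrative assumptions, not the book's model:

```python
# Drift to low performance: perceived performance sags, the goal is
# revised toward it, and the standard erodes step by step. Anchoring
# the goal to the best past performance breaks the downward spiral.
def run(anchor_to_best, steps=100):
    goal = perf = best = 100.0
    for _ in range(steps):
        perf += 0.5 * (goal - perf) - 2.0    # effort closes the gap; a constant drag erodes
        best = max(best, perf)
        if anchor_to_best:
            goal = best                      # standard = best performance so far
        else:
            goal = 0.8 * goal + 0.2 * perf   # goal drifts toward sagging performance
    return perf

drifting = run(anchor_to_best=False)  # sinks steadily: each lowered goal invites lower effort
anchored = run(anchor_to_best=True)   # holds near the original standard despite the same drag
```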
73%
Encouraging variability and experimentation and diversity means “losing control.” Let a thousand flowers bloom and anything could happen! Who wants that? Let’s play it safe and push this lever in the wrong direction by wiping out biological, cultural, social, and market diversity!
78%
Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity—our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
79%
I would guess that most of what goes wrong in systems goes wrong because of biased, late, or missing information.
80%
Pay Attention to What Is Important, Not Just What Is Quantifiable