The Great Mental Models Volume 3: Systems and Mathematics
Read between December 5 and December 11, 2021
16%
When we study a complex system, it’s beneficial to consider how it behaves differently at different scales. Looking at the micro level may mislead us about the macro, and vice versa. In general, systems become more complex as they scale up. Greater size means more connections and interdependencies between parts. Thus, it’s important to combine scale with bottlenecks.
16%
If you do not look at things on a large scale, it will be difficult to master strategy.
16%
But things will always be different as a system scales, and a collection of teams within a company will never be able to communicate like a small company. The larger the company grows, the more work it takes to ensure information flows to the right places.
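A quick way to see why communication gets harder with size: the number of possible one-to-one channels among n people grows roughly quadratically. The sketch below is an editor's illustration using the standard n * (n - 1) / 2 pairing count; it is not a calculation from the book.

```python
# Possible one-to-one communication channels among n people: n * (n - 1) / 2.
# Illustrative only; the point is the superlinear growth, not the exact numbers.

def communication_channels(n: int) -> int:
    """Number of distinct pairs among n people."""
    return n * (n - 1) // 2

for size in (5, 15, 50, 150, 500):
    print(f"{size:>4} people -> {communication_channels(size):>7} possible channels")
```

A team of 5 has 10 possible channels; a company of 500 has 124,750, which is why information flow has to be deliberately engineered rather than left to happen.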
16%
As changes to the system are implemented in response to growth, the question always is: How will this system fare in the next year? Ten years? A hundred years? In other words, how well will it age?
16%
As growth occurs, resilience can be increased by keeping a measure of independence between parts of a system. Dependencies tend to age poorly because they rely on every one of their dependencies aging well.
17%
Scaling up from the small to the large is often accompanied by an evolution from simplicity to complexity while maintaining basic elements or building blocks of the system unchanged or conserved.
17%
Understanding that systems can scale nonlinearly is useful because it helps us appreciate how much a system can change as it grows.
18%
Jane Brox writes in Brilliant that “gaslight divided light—and life—from its singular, self-reliant past. All was now interconnected, contingent, and intricate.”19 People’s homes became part of a larger system.
19%
Artificial light increased the scale of what we could see at night and thus opened up new businesses and new ways of conducting one’s day. Festivities and holiday celebrations began to move later and later into the evening.
19%
Artificial light changed the scale at which human activities can happen. In many ways, the limits of our lights are the limits of our world. There are still places where we lack the means to eradicate darkness, such as outer space and the deepest parts of the oceans.
19%
When you scale up a system, the problems you solved at the smaller scale often need solving again at a larger scale. In addition, you end up with unanticipated possibilities and outcomes. As the scale increases, so does its impact on other systems.
19%
there are often new impacts and requirements as the system develo...
This highlight has been truncated due to consecutive passage length restrictions.
19%
A more interconnected, larger system may be able to handle variations better, but it may also be vulnerable to widespread failures. Increasing the scale of a system might mean using new materials or incorporating methods like the ones that worked on a smaller scale. It might mean rethinking your whole approach.
19%
Systems change as they scale up or down, and neither is intrinsically better or worse. The right scale depends on your goals and the context. If you want to scale something up, you need to anticipate that new problems will keep arising—problems that didn’t exist at a smaller scale. Or you might need to keep solving the same problems in different ways.
20%
When we interact with complex systems, we need to expect the unexpected. Systems do not always function as anticipated. They are subject to variable conditions and can respond to inputs in nonlinear ways. A margin of safety is often necessary to ensure systems can handle stressors and unpredictable circumstances. This means there is a meaningful gap between what a system is capable of handling and what it is required to handle. A margin of safety is a buffer between safety and danger, order and chaos, success and failure. It ensures a system does not swing from one to the other too easily, ...more
20%
This world of ours appears to be separated by a slight and precarious margin of safety from a most singular and unexpected danger. » Arthur Conan Doyle1
20%
engineers know to design for extremes, not averages. In engineering, it’s necessary to consider the most something might need to...
This highlight has been truncated due to consecutive passage length restrictions.
20%
many more than 5,000 cars cross it in a day. A large margin of safety doesn’t eliminate the possibility of failure, but it reduces it.
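A minimal sketch of the engineering idea behind this: a safety factor is the ratio of what a system is built to handle to what it is expected to handle. The capacity and traffic figures below are hypothetical placeholders, not the book's numbers.

```python
# Safety factor: designed capacity divided by expected load.
# All numbers are hypothetical placeholders.

def safety_factor(design_capacity: float, expected_load: float) -> float:
    return design_capacity / expected_load

expected_cars_per_day = 5_000    # planners' estimate (hypothetical)
design_capacity = 20_000         # what the structure is built to carry (hypothetical)

print(f"Safety factor: {safety_factor(design_capacity, expected_cars_per_day):.1f}x")
```

A 4x buffer does not make failure impossible; it just means the load has to exceed expectations by a wide margin before trouble starts.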
20%
For investors, a margin of safety is the gap between an investment vehicle’s intrinsic value and its price. The higher the margin of safety, the safer the investment and the greater the potential profit. Since intrinsic value is subjective, it’s best this buffer be as large as possible to account for uncertainty.
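Expressed as arithmetic, the investor's margin of safety is the discount of the price to your estimate of intrinsic value. A small sketch with hypothetical numbers, and the usual caveat that intrinsic value is itself an uncertain estimate:

```python
# Margin of safety = (intrinsic value - price) / intrinsic value.
# Values below are hypothetical; intrinsic value is always an estimate.

def margin_of_safety(intrinsic_value: float, price: float) -> float:
    """Fraction of estimated intrinsic value you are not paying for."""
    return (intrinsic_value - price) / intrinsic_value

estimated_value = 100.0   # your estimate per share (hypothetical)
market_price = 60.0       # quoted price (hypothetical)

print(f"Margin of safety: {margin_of_safety(estimated_value, market_price):.0%}")  # 40%
```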
20%
When calculating the ideal margin of safety, we always need to consider how high the stakes are. The greater the cost of failure, the bigger the buffer should be.
20%
A system can’t keep working indefinitely without anything breaking down. A system without backups is unlikely to function for long.
20%
If you’re going hiking in the wilderness alone, you might want more than one communication method. You’re safer in an airplane than a car, in part because it has so much backup; after all, the cost of failure is higher.
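A rough way to quantify why backups help: if components fail independently, the chance that every one of them fails at once shrinks geometrically with each layer of redundancy. The probabilities below are illustrative, and real-world failures are rarely fully independent, so treat this as a best case.

```python
# Chance that all n independent components fail at once: p ** n.
# p = 0.01 is an illustrative per-component failure probability.

def prob_total_failure(p_single: float, n_components: int) -> float:
    return p_single ** n_components

for n in (1, 2, 3):
    print(f"{n} redundant component(s): {prob_total_failure(0.01, n):.6f} chance of total failure")
```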
20%
margins of safety sometimes create perverse incentives. If we change our behavior in response to the knowledge that we have a margin of safety in place, we may end up reducing or negating its benefits.
20%
There is a difference between what’s uncomfortable and what ruins you. Most systems can be down for an hour. Our bodies can go without food or water for days. Most businesses can do without revenue for a little while. Too much margin of safety could be a waste of resources and can sow the seeds of becoming uncompetitive.
21%
The more we learn, the fewer blind spots we have. And blind spots are the source of all mistakes. While learning more than we need to get the job done can appear inefficient, the corresponding reduction in blind spots offers a margin of safety. Knowledge allows us to adapt to changing situations.
21%
“over time, I learned how to anticipate problems in order to prevent them, and how to respond effectively in critical situations.”
21%
the ability to parse and solve complex problems rapidly, with incomplete information in a hostile environment—was not something any of us had been born with. But by this point we all had it. We’d developed it on the job.”
22%
Our ego gets in the way of capitalizing on the margin of safety that is produced by knowing more than you need to. Often we learn enough to solve today’s problem but not enough to solve tomorrow’s. There is no margin of safety in what we know.
22%
life will throw at you challenges that require capabilities outside your natural strengths. The only way to be ready is to first build as vast a repertoire of knowledge as you can in anticipation of the possibilities you might face, and second to cultivate the ability to know what is relevant and useful.
22%
“truly being ready means understanding what could go wrong and having a plan to deal with it.”
22%
The professionals plan for “mild randomness” and misunderstand “wild randomness.” They learn from the averages and overlook the outliers. Thus they consistently, predictably, underestimate catastrophic risk.
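A small simulation makes the point concrete: in a heavy-tailed process the average looks tame, while a handful of outliers dominate the total. The distribution and parameters below are the editor's illustration, not data from the book.

```python
# Drawing "shocks" from a heavy-tailed (Pareto) distribution: the average
# is modest, but rare extreme draws account for much of the total.
# Parameters are illustrative only.
import random

random.seed(42)
shocks = [random.paretovariate(1.3) for _ in range(10_000)]

average = sum(shocks) / len(shocks)
worst = max(shocks)
top_1pct_share = sum(sorted(shocks)[-100:]) / sum(shocks)

print(f"average shock:      {average:8.1f}")
print(f"worst single shock: {worst:8.1f}")
print(f"share of the total from the top 1% of shocks: {top_1pct_share:.0%}")
```

Planning around the average would leave no room for the single draw that outweighs thousands of ordinary ones.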
22%
We cannot have a backup plan for everything. We do too much in a day or a year to devote the resources necessary to plan for dealing with disaster in all of our endeavors. However, when the stakes are high, it is worth investing in a comprehensive margin of safety. Extreme events require extreme preparation.
22%
“To lead is to anticipate” was the motto of Jacques Jaujard*, director of the French National Museums during World War II.
22%
Jaujard’s experiences taught him it was best to move Paris’s treasures away if there was any risk whatsoever of attack.15 That way, no matter what, France could hold on to a piece of its pride knowing part of its culture was safe.
23%
We can learn from Jaujard’s removal of artwork from Paris during the war the importance of building in a significant margin of safety when the risk of failure is high. The future is seldom predictable, and so the greater the threat, the more it’s important to plan for the worst.
23%
Broad competence seems very costly compared to specialization, but it is more likely to save us in the outlier situations of life. Efficiency is good for small tasks where failure has little consequence, but life is not exclusively filled with minor challenges and minimal consequences. We are all going to face extreme events where failure is disastrous.
23%
A margin of safety can be an excellent buffer against the unexpected, giving us time to effectively adapt.
24%
Since churn of some sort is inevitable in all systems, it’s useful to ask how we can use it to our benefit. Is it worth going through contortions to keep every customer, or should we let a certain percentage go and focus on the core customers who keep our business going?
24%
Understanding your situation through the lens of churn can help you figure out how to harness the dynamics that drive it.
24%
When we have a customer retention rate of 90%, we may think we’re doing great. But if a competitor retains 95% of their customers, that 5% difference compounds over time: we end up with less growth and have to work a lot harder to keep up.
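A back-of-the-envelope projection shows how the gap compounds. The starting base, acquisition rate, and the competitor's 95% retention are assumptions for illustration, not figures from the book.

```python
# Project a customer base under different annual retention rates,
# with the same number of new customers added each year.
# All inputs are hypothetical.

def project_customers(start: int, retention: float, new_per_year: int, years: int) -> float:
    customers = float(start)
    for _ in range(years):
        customers = customers * retention + new_per_year
    return customers

for retention in (0.90, 0.95):
    total = project_customers(start=1_000, retention=retention, new_per_year=200, years=10)
    print(f"retention {retention:.0%}: ~{total:,.0f} customers after 10 years")
```

With these assumptions, the 90% business ends the decade with roughly 1,650 customers while the 95% business ends with roughly 2,200, despite identical acquisition.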
26%
Algorithms turn inputs into outputs. One reason they are worth understanding is because many systems adjust and respond based on the information provided by algorithms.
26%
Another reason is that they can help systems scale. Once you identify a set of steps that solve a particular problem, you don’t need to start from scratch every time.
26%
“Algorithm” is arguably the single most important concept in our world. If we want to understand our life and our future, we should make every effort to understand what an algorithm is, and how algorithms are connected with emotions. An algorithm is a methodical set of steps that can be used to make calculations, resolve problems, and reach decisions. An algorithm isn’t a particular calculation, but the method followed when making the calculation. » Yuval Noah Harari1
26%
Algorithms are useful partly because of the inherent predictability of their process. That’s why we like them. We can think of algorithms as a series of if–then statements that are completely unambiguous.
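As a concrete instance of "a methodical set of steps" built from unambiguous if-then rules, here is a textbook example (Euclid's method for the greatest common divisor); the example is the editor's, not one the book uses.

```python
# Euclid's algorithm: each step is simple and unambiguous, and the same
# procedure works for any pair of positive integers.

def gcd(a: int, b: int) -> int:
    while b != 0:          # if b is not zero, there is more to do
        a, b = b, a % b    # then replace (a, b) with (b, a mod b)
    return a

print(gcd(1071, 462))  # 21
```

It also fits the criteria Dennett lists below: the procedure works the same on paper or in silicon (substrate neutrality), and no step requires judgment (underlying mindlessness).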
26%
In Intuition Pumps and Other Tools for Thinking, Daniel Dennett* defines an algorithm as “a certain sort of formal process that can be counted on—logically—to yield a certain sort...
This highlight has been truncated due to consecutive passage length restrictions.
26%
three defining characteristics of algorithms:
Substrate neutrality: “The power of the procedure is due to its logical structure, not the causal powers of the materials used in the instantiation.”3 It doesn’t matter whether you read your recipe on a phone or in a book; neither has any impact on the logic of the algorithm.
Underlying mindlessness: “Each constituent step, and the transition between steps, is utterly simple.”4 For a recipe to be an algorithm, it must tell you the amounts of each ingredient you need as well as walk you through the process in steps so clear that there is no room for ...more
27%
human learning as being the product of biological algorithms.
27%
Moving beyond computers, all systems need algorithms to function: sets of instructions for adapting to and solving problems. Increasingly, algorithms are designed to be directionally correct rather than perfect.
27%
They often evolve—or are designed—to get useful and relevant enough outputs to keep the system functioning properly.
27%
When groups of people work together with a shared goal, they need coherent algorithms for turning their inputs into their desired outputs in a repeatable fashion. For many people to move toward the same aim, they must know how to act, how to resolve problems, and how to make decisions in a consistent and reliable manner.