Super Thinking: The Big Book of Mental Models
Read between July 4 - July 24, 2019
1%
What is elementary, worldly wisdom? Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have models in your head. And you’ve got to array your experience—both vicarious and direct—on this latticework of models.
2%
And the models have to come from multiple disciplines—because all the wisdom of the world is not to be found in one little academic department. . . . You’ve got to have models across a fair array of disciplines. You may say, “My God, this is already getting way too tough.” But, fortunately, it isn’t that tough—because 80 or 90 important models will carry about 90 percent of the freight in making you a worldly-wise person. And, of those, only a mere handful really carry very heavy freight.
2%
When I urge a multidisciplinary approach . . . I’m really asking you to ignore jurisdictional boundaries. If you want to be a good thinker, you must develop a mind that can jump these boundaries. You don’t have to know it all. Just take in the best big ideas from all these disciplines. And it’s not that hard to do.
2%
“The best time to plant a tree was twenty years ago. The second best time is now.”
2%
Carl Jacobi was a nineteenth-century German mathematician who often used to say, “Invert, always invert” (actually he said, “Man muss immer umkehren,” because English wasn’t his first language). He meant that thinking about a problem from an inverse perspective can unlock new solutions and strategies. For example, most people approach investing their money from the perspective of making more money; the inverse approach would be investing money from the perspective of not losing money.
2%
Or consider healthy eating. A direct approach would be to try to construct a healthy diet, perhaps by making more food at home with controlled ingredients. An inverse approach, by contrast, would be to try to avoid unhealthy options. You might still go to all the same eating establishments but simply choose the healthier options when there.
2%
The inverse of being right more is being wrong less. Mental models are a tool set that can help you be wrong less. They are a collection of concepts that help you more effectively navigate our complex world.
3%
Thinking alone, though, even from first principles, only gets you so far. Your first principles are merely assumptions that may be true, false, or somewhere in between. Do you really value autonomy in a job, or do you just think you do? Is it really true you need to go back to school to switch careers, or might it actually be unnecessary? Ultimately, to be wrong less, you also need to be testing your assumptions in the real world, a process known as de-risking. There is risk that one or more of your assumptions are untrue, and so the conclusions you reach could also be false.
4%
The Greco-Roman astronomer Ptolemy (circa A.D. 90–168) stated, “We consider it a good principle to explain the phenomena by the simplest hypotheses possible.” More recently, the composer Roger Sessions, paraphrasing Albert Einstein, put it like this: “Everything should be made as simple as it can be, but not simpler!” In medicine, it’s known by this saying: “When you hear hoofbeats, think of horses, not zebras.”
6%
Another way of giving people the benefit of the doubt for their behavior is called Hanlon’s razor: never attribute to malice that which is adequately explained by carelessness. Like Ockham’s razor, Hanlon’s razor seeks out the simplest explanation. And when people do something harmful, the simplest explanation is usually that they took the path of least resistance. That is, they carelessly created the negative outcome; they did not cause the outcome out of malice.
7%
Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.” For example, you should not just consider your current position as a free person when contemplating a world where slavery is allowed. You must consider the possibility that you might have been born a slave, and how ...more
8%
A second mental model that can help you with confirmation bias is the Devil’s advocate position. This was once an official position in the Catholic Church used during the process of canonizing people as saints. Once someone is canonized, the decision is eternal, so it was critical to get it right. Hence this position was created for someone to advocate from the Devil’s point of view against the deceased person’s case for sainthood.
8%
As Charlie Munger says, “I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”
9%
As you can see, you can ask as many questions as you need in order to get to the root cause—five is just an arbitrary number. Nobel Prize–winning physicist Richard Feynman was on the Rogers Commission, agreeing to join upon specific request even though he was then dying of cancer. He uncovered the organizational failure within NASA and threatened to resign from the commission unless its report included an appendix consisting of his personal thoughts around root cause, which reads in part: It appears that there are enormous differences of opinion as to the probability of a failure with loss of ...more
9%
We started this chapter explaining that to be wrong less, you need to both work at getting better over time (antifragile) and make fewer avoidable mistakes in your thinking (unforced errors). Unfortunately, there are a lot of mental traps that you actively need to try to avoid, such as relying too much on recent information (availability bias), being too wed to your existing position (confirmation bias), and overstating the likelihood of your desired outcome (optimistic probability bias). As Feynman warned Caltech graduates in 1974: “You must not fool yourself—and you are the easiest person to ...more
9%
To avoid mental traps, you must think more objectively. Try arguing from first principles, getting to root causes, and seeking out the third story.
9%
Realize that your intuitive interpretations of the world can often be wrong due to availability bias, fundamental attribution error, optimistic probability bias, and other related mental models that explain common errors in thinking.
9%
Use Ockham’s razor and Hanlon’s razor to begin investigating the simplest objective explanations. Then test your theories by de-risking your assu...
10%
Attempt to think gray in an effort to consistently avoid confirmation bias. Actively seek out other perspectives by including the Devil’s advocate position and bypassing the filter bubble. Consider the adage “You are what you eat.” You need to take in a variety of foods to be a healthy person. Like...
10%
You’ve probably gone out to dinner with friends expecting that you will equally split the check. At dinner, each person is faced with a decision to order an expensive meal or a cheaper one. When dining alone, people often order the cheaper meal. However, when they know that the cost of dinner is shared by the whole group, people tend to opt for the expensive meal. If everyone does this, then everyone ends up paying more!
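A quick numeric sketch of this dynamic (the prices and enjoyment values below are invented for illustration, not from the book):

```python
CHEAP_COST, EXPENSIVE_COST = 10, 20    # dollars (hypothetical menu)
CHEAP_VALUE, EXPENSIVE_VALUE = 10, 15  # enjoyment, in dollar terms
N = 4                                  # diners splitting the check evenly

# Dining alone: the upgrade costs $10 extra but is worth only $5 more.
print("alone: gain", EXPENSIVE_VALUE - CHEAP_VALUE,
      "vs cost", EXPENSIVE_COST - CHEAP_COST)  # gain 5 vs cost 10 -> skip it

# Splitting: your own upgrade adds only $10 / 4 = $2.50 to your share,
# so it now looks worth it ($5 gain > $2.50 personal cost).
print("split: personal cost of upgrading =", (EXPENSIVE_COST - CHEAP_COST) / N)

# But when all four diners reason this way, each absorbs a share of every
# upgrade: 4 * $2.50 = $10 extra per person, for only $5 of extra enjoyment.
print("split: extra paid per person if everyone upgrades =",
      N * (EXPENSIVE_COST - CHEAP_COST) / N)
```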
12%
This phenomenon, known as moral hazard, is where you take on more risk, or hazard, once you have information that encourages you to believe you are more protected. It has been a concern of the insurance industry since the seventeenth century! Sometimes moral hazard may involve only one person: wearing a bike helmet may give you a false sense of security, leading you to bike more recklessly, but you are the one who bears all the costs of a bike crash.
13%
Goodhart’s law summarizes the issue: When a measure becomes a target, it ceases to be a good measure.
13%
In A Short History of Nearly Everything, Bill Bryson describes a situation in which paleontologist Gustav Heinrich Ralph von Koenigswald accidentally created perverse incentives on an expedition: Koenigswald’s discoveries might have been more impressive still but for a tactical error that was realized too late. He had offered locals ten cents for every piece of hominid bone they could come up with, then discovered to his horror that they had been enthusiastically smashing large pieces into small ones to maximize their income.
14%
In “Chilling Effects: Online Surveillance and Wikipedia Use,” Oxford researcher Jonathon Penney studied Wikipedia traffic patterns before and after the 2013 revelations by Edward Snowden about the U.S. National Security Agency’s internet spying tactics, finding a 20 percent decline in terrorism-related article views involving terms like al-Qaeda, Taliban, and car bomb. The implication is that when people realized they were being watched by their governments, some of them stopped reading articles that they thought could get them into trouble. The name for this concept is chilling effect.
15%
These unintended consequences are likely to arise when people don’t plan for the long term. From finance, short-termism describes these types of situations, when you focus on short-term results, such as quarterly earnings, over long-term results, such as five-year profits. If you focus on just short-term financial results, you won’t invest enough in the future. Eventually you will be left behind by competitors who are making those long-term investments, or you could be swiftly disrupted by new upstarts (which we cover in Chapter 9).
19%
Sayre’s law, named after political scientist Wallace Sayre, offers that in any dispute the intensity of feeling is inversely proportional to the value of the issues at stake. A related concept is Parkinson’s law of triviality, named after naval historian Cyril Parkinson, which states that organizations tend to give disproportionate weight to trivial issues. Both of these concepts explain how group dynamics can lead the group to focus on the wrong things.
23%
Once you overcome procrastination and are actually making consistent progress toward a goal, the next trap you can fall into is failing to plan your time effectively. Parkinson’s law (yes, another law by the same Parkinson of Parkinson’s law of triviality) states that “work expands so as to fill the time available for its completion.” Does that ring true for you? It certainly does for us. When your top priority has a deadline far in the future, it doesn’t mean that you need to spend all your time on it until the deadline. The sooner you finish, the sooner you can move on to the next item on ...more
24%
what has worked or what has not worked in the past. Architect Christopher Alexander introduced the concept of a design pattern, which is a reusable solution to a design problem. This idea has been adapted to other fields and is especially popular in computer science.
26%
This inertia can lead to suboptimal decisions, referred to as a strategy tax.
26%
The Shirky principle states, “Institutions will try to preserve the problem to which they are the solution.”
26%
The Lindy effect is the name of this phenomenon. It was popularized by Nassim Taleb in his book Antifragile, which we mentioned in Chapter 1. Taleb explains: If a book has been in print for forty years, I can expect it to be in print for another forty years. But, and that is the main difference, if it survives another decade, then it will be expected to be in print another fifty years. This, simply, as a rule, tells you why things that have been around for a long time are not “aging” like persons, but “aging” in reverse. Every year that passes without extinction doubles the additional life ...more
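One compact way to formalize Taleb’s rule, sketched here under the assumption of a power-law (Pareto) survival curve, which the passage itself does not specify:

```latex
% If a lifetime T follows a Pareto distribution with tail exponent
% \alpha > 1, the mean residual life grows linearly with age t:
\mathbb{E}[\,T - t \mid T > t\,] = \frac{t}{\alpha - 1}
% With \alpha = 2, a book already forty years in print is expected to
% survive roughly another forty years, matching Taleb's example.
```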
27%
Momentum is a model that can help you understand how things change. Momentum and inertia are related concepts. In physics, momentum is the product (multiplication) of mass and velocity, whereas inertia is just a function of mass. That means a heavy object at rest has a lot of inertia since it is hard to move, but it has no momentum since its velocity is zero. However, a heavy object gets momentum quickly once it starts moving. The faster an object goes, the more momentum it has. However, its inertia remains the same (since its mass remains the same), and it is still similarly difficult to ...more
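In symbols (standard physics notation, added here for reference, not from the book):

```latex
p = m\,v \qquad \text{(momentum: mass times velocity)}
% Inertia depends on mass alone. A parked truck has v = 0, hence p = 0,
% yet its inertia is just as large as when it is rolling.
```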
31%
The natural increase of entropy over time in a closed system is known as the second law of thermodynamics. Thermodynamics is the study of heat. If you consider our universe as the biggest closed system, this law leads to a plausible end state of our universe as a homogeneous gas, evenly distributed everywhere, commonly known as the heat death of the universe.
34%
In general, survey results can be influenced by response bias in a number of ways, including the following:
- How questions are worded, e.g., leading or loaded questions
- The order of questions, where earlier questions can influence later ones
- Poor or inaccurate memory of respondents
- Difficulty representing feelings in a number, such as one-to-ten ratings
- Respondents reporting things that reflect well on themselves
42%
Unfortunately, studies are much, much more likely to be published if they show statistically significant results, which causes publication bias. Studies that fail to find statistically significant results are still scientifically meaningful, but both researchers and publications have a bias against them for a variety of reasons. For example, there are only so many pages in a publication, and given the choice, publications would rather publish studies with significant findings over ones with none. That’s because successful studies are more likely to attract attention from media and other ...more
42%
The publication of false positives like this directly contributes to the replication crisis and can delay scientific progress by influencing future research toward these false hypotheses. And the fact that negative results aren’t always reported can also lead to different people testing the same negative hypotheses over and over again because no one knows other people have tried them.
43%
While useful in some simple cases, this basic pro-con methodology has significant shortcomings. First, the list presumes there are only two options, when as you just saw there are usually many more. Second, it presents all pros and cons as if they had equal weight. Third, a pro-con list treats each item independently, whereas these factors are often interrelated. A fourth problem is that since the pros are often more obvious than the cons, this disparity can lead to a grass-is-greener mentality, causing you mentally to accentuate the positives (e.g., greener grass) and overlook the negatives.
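A minimal sketch of one common upgrade, weighting each item instead of counting pros and cons equally (the factors and weights below are invented for illustration):

```python
# Weighted pro-con list: pros score positive, cons negative, and the
# magnitudes force you to state how much each factor actually matters.
factors = {
    "higher salary":      +7,
    "more autonomy":      +5,
    "longer commute":     -4,
    "less job security":  -6,
}

score = sum(factors.values())
print("net score:", score)  # +2 here: mildly in favor, not a landslide
```

Note that this fixes only the equal-weight problem; the missing-options and interdependence problems still call for the richer tools, such as decision trees, discussed later.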
46%
there is a philosophy called utilitarianism that expresses the view that the most ethical decision is the one that creates the most utility for all involved.
47%
In a normal distribution, rare events occur on the tails of the distribution (e.g., really tall or short people), far from the middle of the bell curve. Black swan events, though, often come from fat-tailed distributions, which literally have fatter tails, meaning that events way out from the middle have a much higher probability when compared with a normal distribution.
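To make “fatter tails” concrete, here is a small numerical comparison (a sketch assuming scipy is available; a Student’s t with 3 degrees of freedom serves as a rough stand-in for a generic fat-tailed distribution):

```python
from scipy import stats

# Chance of an observation more than 6 "standard units" above center.
thin = stats.norm.sf(6)     # normal (thin-tailed): ~1e-9
fat = stats.t.sf(6, df=3)   # Student's t, df=3 (fat-tailed): ~5e-3

print(f"normal tail: {thin:.2e}")
print(f"t(3) tail:   {fat:.2e}")
print(f"ratio: ~{fat / thin:,.0f}x more likely under the fat-tailed model")
```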
47%
Le Chatelier’s principle, named after French chemist Henri-Louis Le Chatelier, states that when any chemical system at equilibrium is subject to a change in conditions, such as a shift in temperature, volume, or pressure, it readjusts itself into a new equilibrium state and usually partially counteracts the change. For example, if someone hands you a box to carry, you don’t immediately topple over; you instead shift your weight distribution to account for the new weight. Or in economics, if a new tax is introduced, tax revenues from that tax end up being lower in the long run than one would ...more
47%
Monte Carlo simulation. Like critical mass (see Chapter 4), this is a model that emerged during the Manhattan Project in Los Alamos in the run-up to the development of the atomic bomb. Physicist Stanislaw Ulam was struggling to use traditional mathematics to determine how far neutrons would travel through various materials and came up with this new method after playing solitaire (yes, the card game).
48%
A Monte Carlo simulation is actually many simulations run independently, with random initial conditions or other uses of random numbers within the simulation itself. By running a simulation of a system many times, you can begin to understand how probable different outcomes really are. Think of it as a dynamic sensitivity analysis.
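A minimal sketch of the idea (the project-planning question and all numbers are invented for illustration):

```python
import random

# Toy Monte Carlo: a project has three tasks with uncertain durations.
# How likely are we to finish within 30 days? Each run draws one possible
# future; many runs together estimate how probable each outcome is.
def one_run():
    # random.triangular(low, high, mode): plausible range plus best guess
    return (random.triangular(5, 15, 8)
            + random.triangular(4, 12, 6)
            + random.triangular(6, 20, 10))

N = 100_000
on_time = sum(one_run() <= 30 for _ in range(N))
print(f"P(done within 30 days) ≈ {on_time / N:.1%}")
```

Rerunning with different assumptions about the task ranges shows how the outcome distribution shifts, which is what makes it a dynamic sensitivity analysis.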
49%
These types of what-if questions can also be applied to the past, in what is called counterfactual thinking, which means thinking about the past by imagining that the past was different, counter to the facts of what actually occurred. You’ve probably seen this model in books and movies about scenarios such as what would have happened if Germany had won World War II (e.g., Philip K. Dick’s The Man in the High Castle).
49%
Posing what-if questions can nevertheless help you think more creatively, coming up with scenarios that diverge from your intuition. More generally, this technique is one of many associated with lateral thinking, a type of thinking that helps you move laterally from one idea to another, as opposed to critical thinking, which is more about judging an idea in front of you. Lateral thinking is thinking outside the box.
49%
There are many ways to manage groupthink, though, including setting a culture of questioning assumptions, making sure to evaluate all ideas critically, establishing a Devil’s advocate position (see Chapter 1), actively recruiting people with differing opinions, reducing leadership’s influence on group recommendations, and splitting the group into independent subgroups.
50%
In a book entitled Superforecasting, Tetlock examines characteristics that lead superforecasters to make such accurate predictions. As it happens, these are good characteristics to cultivate in general:
- Intelligence: Brainpower is crucial, especially the ability to enter a new domain and get up to speed quickly.
- Domain expertise: While you can learn about a particular domain on the fly, the more you learn about it, the more it helps.
- Practice: Good forecasting is apparently a skill you can hone and get better at over time.
- Working in teams: Groups of people can outperform individuals as long ...more
50%
- When tempted to use a pro-con list, consider upgrading to a cost-benefit analysis or decision tree as appropriate.
- When making any quantitative assessment, run a sensitivity analysis across inputs to uncover key drivers and appreciate where you may need to seek greater accuracy in your assumptions. Pay close attention to any discount rate used.
- Beware of black swan events and unknown unknowns. Use systems thinking and scenario analysis to more systematically uncover them and assess their impact.
- For really complex systems or decision spaces, consider simulations to help you better assess what ...more
51%
Getting into an arms race is not beneficial to anyone involved. There is usually no clear end to the race, as all sides continually eat up resources that could be spent more usefully elsewhere. Think about how much better it would be if the money spent on making campuses luxurious was instead invested in better teaching and other areas that directly impact the quality and accessibility of a college education.
53%
If you want a crash course on the use of these mental models in real-life just go to a casino, where all of them are used simultaneously to ultimately take your money. Casinos give away a lot of free stuff (reciprocity); they get you to first buy chips with cash (commitment); they try to personalize your experience to your interests (liking); they show you examples of other people who won big (social proof); they constantly present you with offers preying on your fear of missing out (scarcity); and dealers will even give you suboptimal advice (authority). Beware. There is a reason why the ...more
54%
When you consider something from a market perspective (like babysitting for money), you consider it in the context of your own financial situation and its impact on you in an impersonal way (“I can earn sixty dollars, but it may not be worth my time”). In contrast, when you consider something from the social perspective (like doing your friend a favor), you consider it in the context of whether it is the right thing to do (“My friend needs my help for four hours, so I am going to help her”).