Super Thinking: Upgrade Your Reasoning and Make Better Decisions with Mental Models
10%
You are guilty of the fundamental attribution error whenever you think someone was mean because she is mean rather than thinking she was just having a bad day.
10%
You of course tend to view your own behavior in the opposite way, which is called self-serving bias. When you are the actor, you often have self-serving reasons for your behavior, but when you are the observer, you tend to blame the other’s intrinsic nature. (That’s why this model is also sometimes called actor-observer bias.) For example, if someone runs a red light, you often assume that person is inherently reckless; you do not consider that she might be rushing to the hospital for an ...
This highlight has been truncated due to consecutive passage length restrictions.
10%
Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil prevent...
10%
For example, you should not just consider your current position as a free person when contemplating a world where slavery is allowed. You must consider the possibility that you might have been born a slave, and how that would feel. Or, when considering policies regarding refugees, you must consider the possibility that you could have been one of those seeking refuge. The veil of ignorance encourages yo...
10%
It can be challenging to acknowledge that a good portion of your success stems from luck. Many people instead choose to believe that the world is completely fair, orderly, and predictable. This view is called the just world hypothesis, where people always get what they deserve, good or bad, because of their actions alone, with no accounting for luck or randomness. This view is summed up as: you reap what you sow.
10%
Learned helplessness describes the tendency to stop trying to escape difficult situations because we have gotten used to difficult conditions over time. Someone learns that they are helpless to control their circumstances, so they give up trying to change them. In a series of experiments summarized in “Learned Helplessness” in the Annual Review of Medicine, psychologist Martin Seligman placed dogs in a box where they were repeatedly shocked at random intervals. Then he placed them in a similar box where they could easily escape the shocks. However, they did not actually try to escape; they simply lay down and waited for the shocks to stop. On the other hand, dogs who were not shocked would quickly jump out of the box.
10%
Learned helplessness can be overcome when animals or people see that their actions can make a difference, tha...
10%
Just as you can be anchored to a price, you can also be anchored to an entire way of thinking about something. In other words, it can be very difficult to convince you of a new idea when a contradictory idea is already entrenched in your thinking.
10%
Thomas Kuhn’s The Structure of Scientific Revolutions popularized the paradigm shift model, describing how accepted scientific theories change over time.
10%
Instead of a gradual, evolving progression, Kuhn describes a bumpy, messy process in which initial problems with a scientific theory are either ignored or rationalized away.
11%
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
11%
Semmelweis’s suggested explanations were not in line with the conventional thinking of the time. Today, this is known as the Semmelweis reflex.
11%
Individuals still hang on to old theories in the face of seemingly overwhelming evidence—it happens all the time in science and in life in general. The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.
11%
There is a reason why many scientific breakthroughs are discovered by outsiders to the field, and why “fresh eyes” and “outside the box” are clichés: outsiders aren’t rooted in existing paradigms. Their reputations aren’t at stake if they question the status quo. They are by definition “free thinkers” because they are free to think without these constraints.
11%
Confirmation bias is so hard to overcome that there is a related model called the backfire effect, which describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it. In other words, when people try to change your mind with facts and figures, the attempt often backfires: you become more entrenched in the original, incorrect position, not less.
11%
The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt when holding two contradictory, or dissonant, beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!
11%
The real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms.
11%
First, consider thinking gray, a concept we learned from Steven Sample’s book The Contrarian’s Guide to Leadership. You may think about issues in terms of black and white, but the truth is somewhere in between, a shade of gray. As Sample puts it:
11%
Most people are binary and instant in their judgments; that is, they immediately categorize things as good or bad, true or false, black or white, friend or foe. A truly effective leader, however, needs to be able to see the shades of gray inherent in a situation in order to make wise decisions as to how to proceed. The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts.
12%
By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm! It can be difficult to think gray because all the nuance and different points of view can cause cognitive dissonance. However, it is worth fighting through that dissonance to get closer to the objective truth.
12%
Playing the devil’s advocate means taking up an opposing side of an argument, even if it is one you don’t agree with.
12%
One approach is to force yourself literally to write down different cases for a given decision or appoint different members in a group to do so. Another, more effective approach is to proactively include people in a decision-making process who are known to hold opposing viewpoints.
12%
“I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”
12%
Daniel Kahneman argues that when you do something frequently, it gradually gets encoded in your brain until at some point your intuition, via your fast thinking, takes over most of the time and you can do the task mindlessly: driving on the highway, doing simple arithmetic, saying your name. However, when you are in uncertain situations where you do not have encoded knowledge, you must use your slower thinking: driving on new roads, doing complex math, digging into your memory to recall someone you used to know. These are not mindless tasks.
12%
Following your intuition alone at times like these can cause you to fall prey to anchoring, availability bias, framing, and other pitfalls. Getting physically lost often starts with you thinking you intuitively know where to go and ends with the realization that your intuition failed you.
12%
Intuition can help guide you to the right answer much more quickly. For example, the more you work with mental models, the more your intuition about which one to use in a given situation will be right, and the faster you will get to better decisions working with these models.
12%
One way to accelerate building up useful intuition like this is to try consistently to argue from first principles. Another is to take every opportunity you can to figure out what is actually causing things to happen.
12%
When something happens, the proximate cause is the thing that immediately caused it to happen.
12%
One technique commonly used in postmortems is called 5 Whys, where you keep asking the question “Why did that happen?” until you reach the root causes.
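The iterative structure of 5 Whys can be sketched in a few lines of code. The incident and its chain of causes below are invented for illustration, not taken from the book:

```python
# Illustrative 5 Whys walk over a hypothetical incident's cause chain.
# Each key maps an observed problem to the answer to "Why did that happen?"
cause_of = {
    "website went down": "a server ran out of disk space",
    "a server ran out of disk space": "log files were never rotated",
    "log files were never rotated": "the log-rotation job was disabled",
    "the log-rotation job was disabled": "a config change removed it",
    "a config change removed it": "config changes are not peer reviewed",
}

def five_whys(problem, causes, depth=5):
    """Ask 'Why did that happen?' up to `depth` times, returning the chain."""
    chain = [problem]
    for _ in range(depth):
        if chain[-1] not in causes:  # no deeper cause known
            break
        chain.append(causes[chain[-1]])
    return chain

chain = five_whys("website went down", cause_of)
# The last entry in the chain is the candidate root cause, not just a symptom.
```

The point of the exercise is that fixing only the first answer (adding disk space) treats a symptom, while the fifth answer (unreviewed config changes) is the kind of root cause worth addressing.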
13%
Sometimes you may want something to be true so badly that you fool yourself into thinking it is likely to be true. This feeling is known as optimistic probability bias, because you are too optimistic about the probability of success.
13%
Root cause analysis, whether you use 5 Whys or some other framework, helps you cut through optimistic probability bias, forcing you to slow down your thinking, push through your intuition, and deliberately uncover the truth.
13%
By investigating root causes, you are not just treating the symptoms but treating the underlying disease. We started this chapter explaining that to be wrong less, you need to both work at getting better over time (antifragile) and make fewer avoidable mistakes in your thinking (unforced errors).
13%
“You must not fool yourself—and you are the easiest person to fool.”
13%
Unintended consequences are not a laughing matter under more serious circumstances.
14%
The tragedy of the commons arises from what is called the tyranny of small decisions, where a series of small, individually rational decisions ultimately leads to a system-wide negative consequence, or tyranny. It’s death by a thousand cuts.
14%
The tyranny of small decisions can be avoided when someone who has a view over the whole system can veto or curb particular individual decisions when broad negative impacts can be foreseen.
14%
Herd immunity: diseases can spread only when they have an eligible host to infect. However, when the vast majority of people are vaccinated against a disease, there are very few eligible new hosts, since most people (in the herd) are immune from infection due to having been vaccinated. As a result, the overall public is less susceptible to outbreaks of the disease.
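The mechanism above has a standard back-of-the-envelope formula (not from the book, but a well-known epidemiological rule of thumb): if each infection causes R₀ new infections in a fully susceptible population, outbreaks die out once more than 1 − 1/R₀ of the herd is immune, because each case then infects fewer than one new host on average. The R₀ value below is an illustrative assumption:

```python
def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune so that each
    infection produces, on average, fewer than one new infection."""
    return 1 - 1 / r0

# Illustrative: with R0 = 4, each case would infect 4 others in a fully
# susceptible herd, so roughly 75% of the herd must be immune.
threshold = herd_immunity_threshold(4)  # 0.75
```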
14%
Externalities are consequences, good or bad, that affect an entity without its consent, imposed from an external source.
14%
Externalities occur wherever there are spillover effects, which happen when an effect of an activity spills over outside the core interactions of the activity.
15%
Internalizing is an attempt to require the entity that causes the negative externality to pay for it.
15%
There are many ways to internalize negative externalities, including taxes, fines, regulation, and lawsuits. Smoking externalities are internalized via cigarette taxes and higher health insurance premiums for smokers. Traffic congestion externalities are internalized through tolls. On a personal level, your neighbor might file a noise complaint against you if you consistently play music too loud.
15%
The Coase theorem is essentially a description of how a natural marketplace can internalize a negative externality. Coase showed that an externality can be internalized efficiently without further need for intervention (that is, without a government or other authority regulating the externality) if the following conditions are met:
15%
if the following conditions are met:
- Well-defined property rights
- Rational actors
- Low transaction costs
When these conditions are met, entities surrounding