Super Thinking: The Big Book of Mental Models
Kindle Notes & Highlights
Read between June 26 - July 6, 2019
Hanlon’s razor seeks out the simplest explanation. And when people do something harmful, the simplest explanation is usually that they took the path of least resistance.
That is, they carelessly created the negative outcome; they did not cause the...
This highlight has been truncated due to consecutive passage length restrictions.
fundamental attribution error, where you frequently make errors by attributing others’ behaviors to their internal, or fundamental, motivations rather than external factors.
You of course tend to view your own behavior in the opposite way, which is called self-serving bias.
When you are the actor, you often have self-serving reasons for your behavior, but when you are the observer, you tend to blame the other’s intrinsic nature.
actor-observ...
Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls.
It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.”
You must consider the possibility that you might have been born a slave, and how that would feel. Or, when considering policies regarding refugees, you must consider the possibility that you could have been one of those seeking refuge.
The veil of ignorance encourages you to empathize with people across a variety of circumstances, so that you can make better moral judgments.
As a manager, it may be easy to imagine changing the policy from your perspective, especially if you personally do not highly value remote working. The veil of ignorance, though, pushes you to imagine the change from the original position, where you could be any employee.
but putting on the veil of ignorance helps you appreciate the challenges this might pose for your staff and might even help you come up with creative alternatives.
Speaking of privilege, we (the authors) often say we are lucky to have won the birth lottery. Not only were we not born into slavery, but we were also not born into almost any disadvantaged group.
Victims of circumstance are actually blamed for their circumstances, with no accounting for factors of randomness like the birth lottery.
You should also keep in mind that the model of learned helplessness can make it hard for some people to strive for improvement without some assistance.
Learned helplessness describes the tendency to stop trying to escape difficult situations because we have gotten used to difficult conditions over time. Someone learns that they are helpless to control their circumstances, so they give up trying to change them.
Learned helplessness can be overcome when animals or people see that their actions can make a difference, that they aren’t actually helpless.
Learned helplessness is not found only in dire situations. People can also exhibit learned helplessness in everyday circumstances, believing they are incapable of doing or learning certain things, such as public speaking or using new technologies. In each of these cases, though, they are probably capable of improving their area of weakness if guided by the right mentor,
You don’t want to make a fundamental attribution error by assuming that your colleague is incapable of doing something when they really just need the proper guidance.
When applying them, you are effectively trying to understand people’s actual circumstances and motivations better, trying as best you can to walk a mile in their shoes.
PROGRESS, ONE FUNERAL AT A TIME
Just as you can be anchored to a price, you can also be anchored to an entire way of thinking about something. In other words, it can be very difficult to convince you of a new idea when a contr...
This highlight has been truncated due to consecutive passage length restrictions.
The Structure of Scientific Revolutions,
the paradigm shift model, describing how accepted scientific theories change over time.
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it,” or, more succinctly, “Science progresses one funeral at a time.”
However, they both noticed obvious and important empirical truths that should have been investigated by other scientists but were reflexively rejected by these scientists because the suggested explanations were not in line with the conventional thinking of the time.
Today, this is known as a Semmelweis reflex.
The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.
There is a reason why many startup companies that disrupt industries are founded by industry outsiders. There is a reason why many scientific breakthroughs are discovered by outsiders to the field.
The reason is that outsiders aren’t rooted in existing paradigms.
They are by definition “free thinkers” because they are free to think wit...
backfire effect that describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it.
You may also succumb to holding on to incorrect beliefs because of disconfirmation bias, where you impose a stronger burden of proof on the ideas you don’t want to believe.
The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt by holding two contradictory, dissonant, beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!
A real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms.
First, consider thinking gray, a concept we learned from Steven Sample’s book The Contrarian’s Guide to Leadership. You may think about issues in terms of black and white, but the truth is somewhere in between, a shade of gray.
The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts.
This model is powerful because it forces you to be patient. By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm!
A second mental model that can help you with confirmation bias is the Devil’s advocate position.
More broadly, playing the Devil’s advocate means taking up an opposing side of an argument, even if it is one you don’t agree with. One approach is to force yourself literally to write down different cases for a given decision or appoint different members in a group to do so.
DON’T TRUST YOUR GUT
You can run into trouble when you blindly trust your gut in situations where it is unclear whether you should be thinking fast or slow. Following your intuition alone at times like these can cause you to fall prey to anchoring, availability bias, framing, and other pitfalls.
To mountain lions, direct eye contact signals that you aren’t easy prey, and so they will hesitate to attack.
To avoid mental traps, you must think more objectively. Try arguing from first principles, getting to root causes, and seeking out the third story. Realize that your intuitive interpretations of the world can often be wrong due to availability bias, fundamental attribution error, optimistic probability bias, and other related mental models that explain common errors in thinking. Use Ockham’s razor and Hanlon’s razor to begin investigating the simplest objective explanations. Then test your theories by de-risking your assumptions, avoiding premature optimization. Attempt to think gray in an...
Anything That Can Go Wrong, Will
ALL YOUR ACTIONS HAVE CONSEQUENCES, but sometimes those consequences are unexpected.
HARM THY NEIGHBOR, UNINTENTIONALLY
Any shared resource, or commons, is vulnerable to this tragedy [of the commons].
People make self-serving edits to Wikipedia articles, diminishing the overall reliability of the encyclopedia.
They use the common resource for their own benefit at little or no cost