However, it is very easy to be wrong about other people’s motivations. You may assume they share your perspective or context, think like you do, or have circumstances similar to yours. With such assumptions, you may conclude that they should also behave like you would or hold your beliefs. Unfortunately, often these assumptions are wrong.
In any conflict between two people, there are two sides of the story. Then there is the third story, the story that a third, impartial observer would recount.
Imagine a complete recording of the situation, and then try to think about what an outside audience would say was happening if they watched or listened to the recording.
Another tactical model that can help you empathize is the most respectful interpretation, or MRI.
MRI asks you to interpret the other parties’ actions in the most respectful way possible. It’s giving people the benefit of the doubt.
MRI asks you to approach a situation from a perspective of respect. You remain open to other interpretations and withhold judgment until necessary.
A related model is called Hanlon’s razor: never attribute to malice that which is adequately explained by carelessness.
Hanlon’s razor says the person probably just didn’t take enough time and care in crafting their message.
The third story, most respectful interpretation, and Hanlon’s razor are all attempts to overcome what psychologists call the fundamental attribution error: the tendency to attribute others’ behaviors to their internal, or fundamental, motivations rather than to external factors.
You of course tend to view your own behavior in the opposite way, which is called self-serving bias.
Another such model is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.”
Speaking of privilege, we (the authors) often say we are lucky to have won the birth lottery. Not only were we not born into slavery, but we were also not born into almost any disadvantaged group.
It can be challenging to acknowledge that a good portion of your success stems from luck.
Many people instead subscribe to the just world hypothesis, where people always get what they deserve, good or bad, because of their actions alone, with no accounting for luck or randomness.
Ironically, belief in a just world can get in the way of actual justice by leading people to victim-blame: The sexual assault victim “should have worn different clothes” or the welfare recipient “is just lazy.”
The problem with the just world hypothesis and victim-blaming is that they make broad judgments about why things are happening to people that are often inaccurate at the individual level.
Learned helplessness can make it hard for some people to strive for improvement without some assistance.
Someone learns that they are helpless to control their circumstances, so they give up trying to change them.
And this strategy actually saves on average eight thousand dollars per person in annual expenses, as the chronically homeless tend to use a lot of public resources, such as hospitals, jails, and shelters.
People can also exhibit learned helplessness in everyday circumstances, believing they are incapable of doing or learning certain things, such as public speaking or using new technologies.
You don’t want to make a fundamental attribution error by assuming that your colleague is incapable of doing something when they really just need the proper guidance.
All the mental models in this section—from the third story to learned helplessness—can help you increase your empathy.
It can be very difficult to convince you of a new idea when a contradictory idea is already entrenched in your thinking.
This dynamic is captured by the paradigm shift model, which describes how accepted scientific theories change over time.
As physicist Max Planck observed, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” This sentiment is often paraphrased as “Science progresses one funeral at a time.”
Consider Alfred Wegener’s theory of continental drift: the idea that the continents drift across the oceans. The prevailing explanation for matching fossils found on distant continents was instead that the continents had once been connected by land bridges that later sank.
Instead of helping to investigate Wegener’s theory (which certainly wasn’t perfect but had promise), geologists chose to hold on to this incorrect land bridge theory until the evidence for continental drift was so overwhelming that a paradigm shift occurred.
Ignaz Semmelweis, a nineteenth-century Hungarian physician, noticed that the maternal death rate in the doctor-run part of his hospital was far higher than in the midwife-run part. Semmelweis obsessed about this difference, painstakingly eliminating all variables until he was left with just one: doctors versus midwives. After studying doctor behavior, he concluded that it must be due to their handling of cadavers and instituted a practice of washing hands with a solution of chlorinated lime. The death rate immediately dropped to match that in the other part of the hospital.
Like Wegener, Semmelweis didn’t fully understand the scientific mechanism that underpinned his theory and crafted an initial explanation that turned out to be somewhat incorrect.
His colleagues nevertheless rejected his findings, a reaction now known as the Semmelweis reflex: the tendency to reject new evidence that contradicts established beliefs.
The tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.
There is a reason why many startup companies that disrupt industries are founded by industry outsiders.
There is also the backfire effect, which describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it. In other words, attempts to change your mind with facts and figures often backfire: you become more entrenched in the original, incorrect position, not less.
You may also succumb to holding on to incorrect beliefs because of disconfirmation bias, where you apply a more critical standard of scrutiny to evidence that contradicts your beliefs than to evidence that supports them. As psychologist Daniel Gilbert describes it:
When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.
When new information conflicts with beliefs you already hold, the resulting mental stress is known as cognitive dissonance.
Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!
A real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms.
First, consider thinking gray, a model from Steven Sample’s book The Contrarian’s Guide to Leadership. Most people instinctively sort important issues into black and white.
A truly effective leader, however, needs to be able to see the shades of gray inherent in a situation in order to make wise decisions as to how to proceed.
Sample’s advice: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts (which happens occasionally, but much less frequently than one might imagine).
F. Scott Fitzgerald once described something similar to thinking gray when he observed that the test of a first-rate mind is the ability to hold two opposing thoughts at the same time while still retaining the ability to function.
By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm!
A second model is the Devil’s advocate position, which originated in the Catholic Church’s canonization process. Once someone is canonized, the decision is eternal, so it was critical to get it right. Hence this position was created for someone to advocate from the Devil’s point of view against the deceased person’s case for sainthood.
One approach is to force yourself literally to write down different cases for a given decision, or to appoint different members of a group to do so.
Another, more effective approach is to proactively include people in a decision-making process who are known to hold opposing viewpoints.
As Charlie Munger put it: “I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”

