Kindle Notes & Highlights
Read between June 26 – July 6, 2019
Hanlon’s razor seeks out the simplest explanation. And when people do something harmful, the simplest explanation is usually that they took the path of least resistance.
That is, they carelessly created the negative outcome; they did not cause the...
fundamental attribution error, where you frequently make errors by attributing others’ behaviors to their internal, or fundamental, motivations rather than external factors.
You of course tend to view your own behavior in the opposite way, which is called self-serving bias.
When you are the actor, you often have self-serving reasons for your behavior, but when you are the observer, you tend to blame the other’s intrinsic nature.
actor-observ...
Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls.
It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.”
You must consider the possibility that you might have been born a slave, and how that would feel. Or, when considering policies regarding refugees, you must consider the possibility that you could have been one of those seeking refuge.
The veil of ignorance encourages you to empathize with people across a variety of circumstances, so that you can make better moral judgments.
As a manager, it may be easy to imagine changing the policy from your perspective, especially if you personally do not highly value remote working. The veil of ignorance, though, pushes you to imagine the change from the original position, where you could be any employee.
but putting on the veil of ignorance helps you appreciate the challenges this might pose for your staff and might even help you come up with creative alternatives.
Speaking of privilege, we (the authors) often say we are lucky to have won the birth lottery. Not only were we not born into slavery, but we were also not born into almost any disadvantaged group.
Victims of circumstance are actually blamed for their circumstances, with no accounting for factors of randomness like the birth lottery.
You should also keep in mind that the model of learned helplessness can make it hard for some people to strive for improvement without some assistance.
Learned helplessness describes the tendency to stop trying to escape difficult situations because we have gotten used to difficult conditions over time. Someone learns that they are helpless to control their circumstances, so they give up trying to change them.
Learned helplessness can be overcome when animals or people see that their actions can make a difference, that they aren’t actually helpless.
Learned helplessness is not found only in dire situations. People can also exhibit learned helplessness in everyday circumstances, believing they are incapable of doing or learning certain things, such as public speaking or using new technologies. In each of these cases, though, they are probably capable of improving their area of weakness if guided by the right mentor,
You don’t want to make a fundamental attribution error by assuming that your colleague is incapable of doing something when they really just need the proper guidance.
When applying them, you are effectively trying to understand people’s actual circumstances and motivations better, trying as best you can to walk a mile in their shoes.
Just as you can be anchored to a price, you can also be anchored to an entire way of thinking about something. In other words, it can be very difficult to convince you of a new idea when a contr...
In The Structure of Scientific Revolutions, Thomas Kuhn introduced the paradigm shift model, describing how accepted scientific theories change over time.
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it,” or, more succinctly, “Science progresses one funeral at a time.”
However, they both noticed obvious and important empirical truths that should have been investigated by other scientists but were reflexively rejected, because the suggested explanations were not in line with the conventional thinking of the time.
Today, this is known as the Semmelweis reflex.
The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.
There is a reason why many startup companies that disrupt industries are founded by industry outsiders. There is a reason why many scientific breakthroughs are discovered by outsiders to the field.
The reason is that outsiders aren’t rooted in existing paradigms.
They are by definition “free thinkers” because they are free to think wit...
The backfire effect describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it.
You may also succumb to holding on to incorrect beliefs because of disconfirmation bias, where you impose a stronger burden of proof on the ideas you don’t want to believe.
The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt by holding two contradictory, dissonant, beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!
A real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms.
First, consider thinking gray, a concept we learned from Steven Sample’s book The Contrarian’s Guide to Leadership. You may think about issues in terms of black and white, but the truth is somewhere in between, a shade of gray.
The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts.
This model is powerful because it forces you to be patient. By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm!
A second mental model that can help you with confirmation bias is the Devil’s advocate position.
More broadly, playing the Devil’s advocate means taking up an opposing side of an argument, even if it is one you don’t agree with. One approach is to force yourself literally to write down different cases for a given decision or appoint different members in a group to do so.
You can run into trouble when you blindly trust your gut in situations where it is unclear whether you should be thinking fast or slow. Following your intuition alone at times like these can cause you to fall prey to anchoring, availability bias, framing, and other pitfalls.
To mountain lions, direct eye contact signals that you aren’t easy prey, and so they will hesitate to attack.
To avoid mental traps, you must think more objectively. Try arguing from first principles, getting to root causes, and seeking out the third story. Realize that your intuitive interpretations of the world can often be wrong due to availability bias, fundamental attribution error, optimistic probability bias, and other related mental models that explain common errors in thinking. Use Ockham’s razor and Hanlon’s razor to begin investigating the simplest objective explanations. Then test your theories by de-risking your assumptions, avoiding premature optimization. Attempt to think gray in an...
ALL YOUR ACTIONS HAVE CONSEQUENCES, but sometimes those consequences are unexpected.
HARM THY NEIGHBOR, UNINTENTIONALLY
Any shared resource, or commons, is vulnerable to the tragedy of the commons.
People make self-serving edits to Wikipedia articles, diminishing the overall reliability of the encyclopedia.
They use the common resource for their own benefit at little or no cost.