Kindle Notes & Highlights
Read between December 31, 2019 – May 22, 2020
You are guilty of the fundamental attribution error whenever you think someone was mean because she is mean rather than thinking she was just having a bad day.
You of course tend to view your own behavior in the opposite way, which is called self-serving bias. When you are the actor, you often have self-serving reasons for your behavior, but when you are the observer, you tend to blame the other’s intrinsic nature. (That’s why this model is also sometimes called actor-observer bias.) For example, if someone runs a red light, you often assume that person is inherently reckless; you do not consider that she might be rushing to the hospital for an ...
This highlight has been truncated due to consecutive passage length restrictions.
Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil prevent...
For example, you should not just consider your current position as a free person when contemplating a world where slavery is allowed. You must consider the possibility that you might have been born a slave, and how that would feel. Or, when considering policies regarding refugees, you must consider the possibility that you could have been one of those seeking refuge. The veil of ignorance encourages yo...
It can be challenging to acknowledge that a good portion of your success stems from luck. Many people instead choose to believe that the world is completely fair, orderly, and predictable. This view is called the just world hypothesis, where people always get what they deserve, good or bad, because of their actions alone, with no accounting for luck or randomness. This view is summed up as you reap what you sow.
Learned helplessness describes the tendency to stop trying to escape difficult situations because we have gotten used to difficult conditions over time. Someone learns that they are helpless to control their circumstances, so they give up trying to change them. In a series of experiments summarized in “Learned Helplessness” in the Annual Review of Medicine, psychologist Martin Seligman placed dogs in a box where they were repeatedly shocked at random intervals. Then he placed them in a similar box where they could easily escape the shocks. However, they did not actually try to escape; they simply lay down and waited for the shocks to stop. By contrast, dogs that had not been shocked would quickly jump out of the box.
Learned helplessness can be overcome when animals or people see that their actions can make a difference, tha...
Just as you can be anchored to a price, you can also be anchored to an entire way of thinking about something. In other words, it can be very difficult to convince you of a new idea when a contradictory idea is already entrenched in your thinking.
Thomas Kuhn’s The Structure of Scientific Revolutions popularized the paradigm shift model, describing how accepted scientific theories change over time.
Instead of a gradual, evolving progression, Kuhn describes a bumpy, messy process in which initial problems with a scientific theory are either ignored or rationalized away.
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it,” as physicist Max Planck observed.
Semmelweis’s suggested explanations were not in line with the conventional thinking of the time. Today, this reflexive rejection of new evidence that contradicts entrenched beliefs is known as the Semmelweis reflex.
Individuals still hang on to old theories in the face of seemingly overwhelming evidence—it happens all the time in science and in life in general. The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.
There is a reason why many scientific breakthroughs are discovered by outsiders to the field. There is a reason why “fresh eyes” and “outside the box” are clichés. The reason is that outsiders aren’t rooted in existing paradigms. Their reputations aren’t at stake if they question the status quo. They are by definition “free thinkers” because they are free to think without these constraints.
Confirmation bias is so hard to overcome that there is a related model called the backfire effect that describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it. In other words, it often backfires when people try to change your mind with facts and figures, having the opposite effect on you than it should; you become more entrenched in the original, incorrect position, not less.
The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt by holding two contradictory, dissonant beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!
The real trick to being wrong less is to fight your instincts to dismiss new information and instead to embrace new ways of thinking and new paradigms.
First, consider thinking gray, a concept we learned from Steven Sample’s book The Contrarian’s Guide to Leadership. You may think about issues in terms of black and white, but the truth is somewhere in between, a shade of gray. As Sample puts it:
Most people are binary and instant in their judgments; that is, they immediately categorize things as good or bad, true or false, black or white, friend or foe. A truly effective leader, however, needs to be able to see the shades of gray inherent in a situation in order to make wise decisions as to how to proceed. The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts.
By delaying decision making, you avoid confirmation bias since you haven’t yet made a decision to confirm! It can be difficult to think gray because all the nuance and different points of view can cause cognitive dissonance. However, it is worth fighting through that dissonance to get closer to the objective truth.
Playing Devil’s advocate means taking up the opposing side of an argument, even if it is one you don’t agree with.
One approach is to force yourself literally to write down different cases for a given decision or appoint different members in a group to do so. Another, more effective approach is to proactively include people in a decision-making process who are known to hold opposing viewpoints.
As Charlie Munger puts it: “I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”
Daniel Kahneman, in Thinking, Fast and Slow, argues that when you do something frequently, it gradually gets encoded in your brain until at some point your intuition, via your fast thinking, takes over most of the time and you can do the task mindlessly: driving on the highway, doing simple arithmetic, saying your name. However, when you are in uncertain situations where you do not have encoded knowledge, you must use your slower thinking: driving on new roads, doing complex math, digging into your memory to recall someone you used to know. These are not mindless tasks.
Following your intuition alone at times like these can cause you to fall prey to anchoring, availability bias, framing, and other pitfalls. Getting physically lost often starts with you thinking you intuitively know where to go and ends with the realization that your intuition failed you.
intuition can help guide you to the right answer much more quickly. For example, the more you work with mental models, the more your intuition about which one to use in a given situation will be right, and the faster you will get to better decisions working with these models.
One way to accelerate building up useful intuition like this is to try consistently to argue from first principles. Another is to take every opportunity you can to figure out what is actually causing things to happen.
When something happens, the proximate cause is the thing that immediately caused it to happen; the root cause is the deeper underlying reason it happened.
One technique commonly used in postmortems is called 5 Whys, where you keep asking the question “Why did that happen?” until you reach the root causes.
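The 5 Whys drill-down described above can be sketched in a few lines of code. This is a minimal illustration only; the outage-to-process causal chain in the example data is hypothetical, not from the book.

```python
def five_whys(event, known_causes, depth=5):
    """Follow 'why did that happen?' up to `depth` times,
    stopping when no deeper cause is known (a root cause)."""
    chain = [event]
    for _ in range(depth):
        cause = known_causes.get(chain[-1])
        if cause is None:  # nothing deeper recorded: treat as root cause
            break
        chain.append(cause)
    return chain

# Hypothetical postmortem data: each symptom mapped to its cause.
known_causes = {
    "website went down": "server ran out of memory",
    "server ran out of memory": "a deploy introduced a leak",
    "a deploy introduced a leak": "the change skipped code review",
    "the change skipped code review": "no review is required for hotfixes",
}

print(five_whys("website went down", known_causes))
```

The last item in the returned chain ("no review is required for hotfixes") is the root cause: fixing it prevents the whole class of failures, not just this outage.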
Sometimes you may want something to be true so badly that you fool yourself into thinking it is likely to be true. This feeling is known as optimistic probability bias, because you are too optimistic about the probability of success.
Root cause analysis, whether you use 5 Whys or some other framework, helps you cut through optimistic probability bias, forcing you to slow down your thinking, push through your intuition, and deliberately uncover the truth.
By investigating root causes, you are not just treating the symptoms but treating the underlying disease. We started this chapter explaining that to be wrong less, you need to both work at getting better over time (antifragile) and make fewer avoidable mistakes in your thinking (unforced errors).
“You must not fool yourself—and you are the easiest person to fool,” as physicist Richard Feynman warned.
Unintended consequences are not a laughing matter under more serious circumstances.
the tragedy of the commons arises from what is called the tyranny of small decisions, where a series of small, individually rational decisions ultimately leads to a system-wide negative consequence, or tyranny. It’s death by a thousand cuts.
The tyranny of small decisions can be avoided when someone who has a view over the whole system can veto or curb particular individual decisions when broad negative impacts can be foreseen.
Consider herd immunity: diseases can spread only when they have an eligible host to infect. However, when the vast majority of people are vaccinated against a disease, there are very few eligible new hosts, since most people (in the herd) are immune from infection due to getting vaccinated. As a result, the overall public is less susceptible to outbreaks of the disease.
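How large "the vast majority" must be can be estimated with the standard herd immunity threshold formula, 1 − 1/R0, where R0 is the disease's basic reproduction number (the average number of people one case infects in a fully susceptible population). A quick sketch, using rough illustrative R0 values rather than figures from the book:

```python
def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune so that each
    infected person passes the disease to fewer than one new host."""
    return 1 - 1 / r0

# Rough textbook-style R0 values, for illustration only.
for disease, r0 in [("measles", 15), ("seasonal flu", 1.5)]:
    print(f"{disease}: about {herd_immunity_threshold(r0):.0%} must be immune")
```

The intuition matches the highlight: the more contagious the disease (higher R0), the larger the immune fraction of the herd must be before outbreaks die out on their own.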
Externalities are consequences, good or bad, that affect an entity without its consent, imposed from an external source.
Externalities occur wherever there are spillover effects, which happen when an effect of an activity spills over outside the core interactions of the activity.
Internalizing is an attempt to require the entity that causes the negative externality to pay for it.
There are many ways to internalize negative externalities, including taxes, fines, regulation, and lawsuits. Smoking externalities are internalized via cigarette taxes and higher health insurance premiums for smokers. Traffic congestion externalities are internalized through tolls. On a personal level, your neighbor might file a noise complaint against you if you consistently play music too loud.
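The arithmetic behind tax-based internalization is simple enough to sketch. The dollar figures below are made up for illustration; the point is only that setting the tax equal to the external cost makes the actor face the full social cost of the activity.

```python
def internalized_cost(private_cost, external_cost):
    """With a tax set equal to the external cost, the cost the actor
    faces equals the full social cost of the activity."""
    tax = external_cost
    return private_cost + tax

# Hypothetical numbers: an activity that costs the actor $10 directly
# and imposes $4 of costs on everyone else (e.g. health care, noise).
private, external = 10.0, 4.0
social_cost = private + external  # 14.0

print(internalized_cost(private, external))  # equals the social cost
```

Once the actor's own cost matches the social cost, they will only continue the activity when it is worth its true total cost, which is exactly what internalizing is meant to achieve.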
The Coase theorem is essentially a description of how a natural marketplace can internalize a negative externality. Coase showed that an externality can be internalized efficiently without further need for intervention (that is, without a government or other authority regulating the externality) if the following conditions are met: well-defined property rights, rational actors, and low transaction costs. When these conditions are met, entities surrounding ...