Kindle Notes & Highlights
by Annie Duke
Read between April 21 and April 30, 2023
Self-serving bias is a deeply embedded and robust thinking pattern. Understanding why this pattern emerges is the first step to developing practical strategies to improve our ability to learn from experience. These strategies encourage us to be more rational in the way we field outcomes, fostering open-mindedness in considering all the possible causes of an outcome, not just the ones that flatter us.
Black-and-white thinking, uncolored by the reality of uncertainty, is a driver of both motivated reasoning and self-serving bias. If our only options are being 100% right or 100% wrong, with nothing in between, then information that potentially contradicts a belief requires a total downgrade, from right all the way to wrong. There is no “somewhat less sure” option in an all-or-nothing world, so we ignore or discredit the information to hold steadfast in our belief.
Outcomes are rarely the result of our decision quality alone or chance alone, and outcome quality is not a perfect indicator of the influence of luck or skill. When it comes to self-serving bias, we act as if our good outcomes are perfectly correlated to good skill and our bad outcomes are perfectly correlated to bad luck.* Whether it is a poker hand, an auto accident, a football call, a trial outcome, or a business success, there are elements of luck and skill in virtually any outcome.
moribund
Taking credit for a win lifts our personal narrative. So too does knocking down a peer by finding them at fault for a loss. That’s schadenfreude: deriving pleasure from someone else’s misfortune. Schadenfreude is basically the opposite of compassion.
We are really in competition for resources with everyone. Our genes are competitive. As Richard Dawkins points out, natural selection proceeds by competition among the phenotypes of genes, so we literally evolved to compete, a drive that allowed our species to survive. Engaging the world through the lens of competition is deeply embedded in our animal brains. It’s not enough to boost our self-image solely by our own successes. If someone we view as a peer is winning, we feel like we’re losing by comparison. We benchmark ourselves to them.
Once we start listening for it, we hear a chorus out in the world like I heard during breaks in poker tournaments: “things are going great because I’m making such good decisions”; “things went poorly because I got so unlucky.” That’s what the lawyer heard from his senior partner in each evening’s postmortem of the trial. That’s what we heard from Chris Christie in the 2016 Republican presidential debate. That’s what I heard in every poker room I was ever in.
The benefits of recognizing just a few extra learning opportunities compound over time. The cumulative effect of being a little better at decision-making, like compounding interest, can have huge effects in the long run on everything that we do.
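To make the compounding claim concrete, here is a minimal sketch; the 1% per-decision edge and the decision count are purely illustrative assumptions, not figures from the book:

```python
# Hypothetical illustration of the compounding claim above.
# Assumes each decision is 1% better than baseline (a made-up edge)
# applied over a made-up number of decisions.
edge_per_decision = 1.01    # assumed: 1% improvement per decision
decisions_per_year = 500    # assumed: meaningful decisions in a year

cumulative_edge = edge_per_decision ** decisions_per_year
print(f"Cumulative effect after {decisions_per_year} decisions: "
      f"{cumulative_edge:.1f}x baseline")
# 1.01 ** 500 is roughly 144.8: the same arithmetic as compound interest.
```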
Such interactions are reminders that not all situations are appropriate for truthseeking, nor are all people interested in the pursuit.
“You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland and I show you how deep the rabbit hole goes.” As Neo reaches toward a pill, Morpheus reminds him, “Remember, all I am offering is the truth. Nothing more.”
In the movie, the matrix was built to be a more comfortable version of the world. Our brains, likewise, have evolved to make our version of the world more comfortable: our beliefs are nearly always correct; favorable outcomes are the result of our skill; there are plausible reasons why unfavorable outcomes are beyond our control; and we compare favorably with our peers. We deny or at least dilute the most painful parts of the message.
Giving that up is not the easiest choice. Living in the matrix is comfortable. So is the natural way we process information to protect our self-image in the moment. By choosing to exit the matrix, we are asserting that striving for a more objective representation of the world, even if it is uncomfortable at times, will make us happier and more successful in the long run.
If we can find a few people to choose to form a truthseeking pod with us and help us do the hard work connected with it, it will move the needle—just a little bit, but with improvements that accumulate and compound over time. We will be more successful in fighting bias, seeing the world more objectively, and, as a result, we will make better decisions. Doing it on our own is just harder.
Confirmatory thought amplifies bias, promoting and encouraging motivated reasoning because its main purpose is justification. Confirmatory thought promotes a love and celebration of one’s own beliefs, distorting how the group processes information and works through decisions, the result of which can be groupthink. Exploratory thought, on the other hand, encourages an open-minded and objective consideration of alternative hypotheses and a tolerance of dissent to combat bias. Exploratory thought helps the members of a group reason toward a more accurate representation of the world.
The expression “echo chamber” instantly conjures up the image of what results from our natural drift toward confirmatory thought.
Motivated reasoning and self-serving bias are two habits of mind that are deeply rooted in how our brains work. We have a huge investment in confirmatory thought, and we fall into these biases all the time without even knowing it. Confirmatory thought is hard to spot, hard to change, and, if we do try changing it, hard to self-reinforce. It is one thing to commit to rewarding ourselves for thinking in bets, but it is a lot easier if we get others to do the work of rewarding us.
“The only way in which a human being can make some approach to knowing the whole of a subject, is by hearing what can be said about it by persons of every variety of opinion, and studying all modes in which it can be looked at by every character of mind. No wise man ever acquired his wisdom in any mode but this; nor is it in the nature of human intellect to become wise in any other manner.”
To get a more objective view of the world, we need an environment that exposes us to alternate hypotheses and different perspectives. That doesn’t apply only to the world around us: to view ourselves in a more realistic way, we need other people to fill in our blind spots.
Why might my belief not be true? What other evidence might be out there bearing on my belief? Are there similar areas I can look toward to gauge whether similar beliefs to mine are true? What sources of information could I have missed or minimized on the way to reaching my belief? What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me? What other perspectives are there as to why things turned out the way they did? Just by asking ourselves these questions, we are taking a big step toward calibration. But there is only so …
CUDOS stands for Communism (data belong to the group), Universalism (apply uniform standards to claims and evidence, regardless of where they came from), Disinterestedness (vigilance against potential conflicts that can influence the group’s evaluation), and Organized Skepticism (discussion among the group to encourage engagement and dissent). If you want to pick a role model for designing a group’s practical rules of engagement, you can’t do better than Merton.
Organized skepticism invites people into a cooperative exploration. People are more open to hearing differing perspectives expressed this way. Skepticism should be encouraged and, where possible, operationalized. The term “devil’s advocate” developed centuries ago from the Catholic Church’s practice, during the canonization process, of hiring someone to present arguments against sainthood.
Make no mistake: the process of seeing ourselves and the world more accurately and objectively is hard and makes us think about things we generally avoid. The group needs rules of engagement that don’t make this harder by letting members get away with being nasty or dismissive. And we need to be aware that even a softer serve of dissent to those who have not agreed to the truthseeking charter can be perceived as confrontational.
First, express uncertainty. Uncertainty not only improves truthseeking within groups but also invites everyone around us to share helpful information and dissenting opinions. Fear of being wrong (or of having to suggest someone else is wrong) countervails the social contract of confirmation, often causing people to withhold valuable insights and opinions from us.
Second, lead with assent. For example, listen for the things you agree with, state those and be specific, and then follow with “and” instead of “but.” If there is one thing we have learned thus far it is that we like having our ideas affirmed. If we want to engage someone with whom we have some disagreement (inside or outside our group), they will be more open and less defensive if we start with those areas of agreement, which there surely will be.
“BUT . . . ,” that challenge puts people on the defensive. “And” is an offer to contribute. “But” is a denial and repudiation of what came before.
Third, ask for a temporary agreement to engage in truthseeking. If someone is off-loading emotion to us, we can ask them if they are just looking to vent or if they are looking for advice. If they aren’t looking for advice, that’s fine. The rules of engagement have been made clear. Sometimes, people just want to vent. I certainly do. It’s in our nature. We want to be supportive of the people around us, and that includes comforting them when they just need some understanding and sympathy. But sometimes they’ll say they are looking for advice, and that is potentially an agreement to opt in to …
Finally, focus on the future. As I said at the beginning of this book, we are generally pretty good at identifying the positive goals we are striving for; our problem is in the execution of the decisions along the way to reaching those goals. People dislike engaging with their poor execution.
When we make in-the-moment decisions (and don’t ponder the past or future), we are more likely to be irrational and impulsive.* This tendency we all have to favor our present-self at the expense of our future-self is called temporal discounting.
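The highlight leaves temporal discounting qualitative; in behavioral economics it is often modeled with a hyperbolic discount curve. A minimal sketch, where the discount rate k is an illustrative assumption:

```python
# Hyperbolic discounting: subjective value V = A / (1 + k * D),
# where A is the reward, D the delay, and k a fitted discount rate.
def discounted_value(amount: float, delay_days: float, k: float = 0.05) -> float:
    """Subjective present value of a reward received after a delay.
    The k = 0.05 default is purely illustrative."""
    return amount / (1 + k * delay_days)

# Why we favor present-self: a smaller-sooner reward can subjectively
# beat a larger-later one.
print(discounted_value(100, delay_days=0))   # 100.0 -> $100 now
print(discounted_value(150, delay_days=30))  # 60.0  -> $150 in a month feels like $60
```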
When we think about the past and the future, we engage deliberative mind, improving our ability to make a more rational decision. When we imagine the future, we don’t just make it up out of whole cloth, inventing a future based on nothing that we have ever seen or experienced. Our vision of the future, rather, is rooted in our memories of the past. The future we imagine is a novel reassembling of our past experiences.
Thinking about the future is remembering the future, putting memories together in a creative way to imagine a possible way things might turn out.
One of our time-travel goals is to create moments like that, where we can interrupt an in-the-moment decision and take some time to consider the decision from the perspective of our past and future. We can then create a habit routine around these decision interrupts to encourage this perspective taking, asking ourselves a set of simple questions at the moment of the decision designed to get future-us and past-us involved. We can do this by imagining how future-us is likely to feel about the decision or by imagining how we might feel about the decision today if past-us had made it. The …
Business journalist and author Suzy Welch developed a popular tool known as 10-10-10 that has the effect of bringing future-us into more of our in-the-moment decisions. “Every 10-10-10 process starts with a question. . . . [W]hat are the consequences of each of my options in ten minutes? In ten months? In ten years?” This set of questions triggers mental time travel that cues that accountability conversation.
Signs of the illusion of certainty: “I know,” “I’m sure,” “I knew it,” “It always happens this way,” “I’m certain of it,” “You’re 100% wrong,” “You have no idea what you’re talking about,” “There’s no way that’s true,” “0%” or “100%” or their equivalents, and other terms signaling that we’re presuming things are more certain than we know they are. This also includes stating things as absolutes, like “best” or “worst” and “always” or “never.”
The word “wrong,” which deserves its own swear jar. The Mertonian norm of organized skepticism allows little place in exploratory discussion for the word “wrong.” “Wrong” is a conclusion, not a rationale. And it’s not a particularly accurate conclusion since, as we know, nearly nothing is 100% or 0%. Any words or thoughts denying the existence of uncertainty should be a signal that we are heading toward a poorly calibrated decision.
For us to make better decisions, we need to perform reconnaissance on the future. If a decision is a bet on a particular future based on our beliefs, then before we place a bet we should consider in detail what those possible futures might look like.
Scouting various futures has numerous additional benefits. First, scenario planning reminds us that the future is inherently uncertain. By making that explicit in our decision-making process, we have a more realistic view of the world. Second, we are better prepared for how we are going to respond to different outcomes that might result from our initial decision. We can anticipate positive or negative developments and plan our strategy, rather than being reactive. Being able to respond to the changing future is a good thing; being surprised by the changing future is not.
Imagining the future recruits the same brain pathways as remembering the past. And it turns out that remembering the future is a better way to plan for it. From the vantage point of the present, it’s hard to see past the next step. We end up over-planning to address the problems we have right now. Implicit in that approach is the assumption that conditions will remain the same, facts won’t change, and the paradigm will remain stable. The world changes too fast for that assumption to be generally valid.
The most common form of working backward from our goal to map out the future is known as backcasting. In backcasting, we imagine we’ve already achieved a positive outcome, holding up a newspaper with the headline “We Achieved Our Goal!” Then we think about how we got there.
Imagining a successful future and backcasting from there is a useful time-travel exercise for identifying necessary steps for reaching our goals. Working backward helps even more when we give ourselves the freedom to imagine an unfavorable future.
A premortem is an investigation into something awful, but before it happens. We all like to bask in an optimistic view of the future. We generally are biased to overestimate the probability of good things happening. Looking at the world through rose-colored glasses is natural and feels good, but a little naysaying goes a long way.
Backcasting and premortems complement each other. Backcasting imagines a positive future; a premortem imagines a negative future. We can’t create a complete picture without representing both the positive space and the negative space. Backcasting reveals the positive space. Premortems reveal the negative space. Backcasting is the cheerleader; a premortem is the heckler in the audience.
Despite the popular wisdom that we achieve success through positive visualization, it turns out that incorporating negative visualization makes us more likely to achieve our goals.
When we backcast and imagine the things that went right, we also reveal the problems that would arise if those things didn’t go right. Backcasting doesn’t, therefore, ignore the negative space so much as it overrepresents the positive space. It’s in our optimistic nature (and natural in backcasting) to imagine a successful future. Without a premortem, we don’t see as many paths to the future in which we don’t reach our goals. A premortem forces us to build out that side of the tree where things don’t work out. In the process, we are likely to realize that’s a pretty robust part of the tree.
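To see what building out the failure side of the tree changes, here is a toy expected-value sketch; every branch, probability, and payoff below is hypothetical, not from the book:

```python
# A backcast gives the success branch; a premortem forces us to
# enumerate the failure branches. All numbers here are made up.
scenarios = [
    ("goal reached (backcast branch)",     0.40, +100),
    ("key assumption fails (premortem)",   0.25,  -20),
    ("competitor moves first (premortem)", 0.20,  -40),
    ("timeline slips (premortem)",         0.15,  -10),
]

expected_value = sum(p * payoff for _, p, payoff in scenarios)
print(f"Expected value across all branches: {expected_value:+.1f}")
# Evaluating only the +100 success path would badly overstate the bet.
```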
That’s what hindsight bias is, and we’re all running amok through the forest with a chainsaw once we get an outcome. Once something occurs, we no longer think of it as probabilistic—or as ever having been probabilistic. This is how we get into the frame of mind where we say, “I should have known” or “I told you so.” This is where unproductive regret comes from. By keeping an accurate representation of what could have happened (and not a version edited by hindsight), memorializing the scenario plans and decision trees we create through a good planning process, we can be better calibrators going forward.