Kindle Notes & Highlights
by Annie Duke
Read between July 23 – July 23, 2019
As Daniel Kahneman pointed out, we just want to think well of ourselves and feel that the narrative of our life story is a positive one. Being wrong doesn’t fit into that narrative. If we think of beliefs as only 100% right or 100% wrong, when confronting new information that might contradict our belief, we have only two options: (a) make the massive shift in our opinion of ourselves from 100% right to 100% wrong, or (b) ignore or discredit the new information. It feels bad to be wrong, so we choose (b). Information that disagrees with us is an assault on our self-narrative. We’ll work hard to
…
when additional information agrees with us, we effortlessly embrace it.
Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view. After all, people in the “spin room” in a political setting are generally pretty smart for a reason.
It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.
Unfortunately, this is just the way evolution built us. We are wired to protect our beliefs even when our goal is to truthseek. This is one of those instances where being smart and aware of our capacity for irrationality alone doesn’t help us refrain from biased reasoning.
that learning occurs when you get lots of feedback tied closely in time to decisions and actions.
“Self-serving bias” is the term for this pattern of fielding outcomes. Psychologist Fritz Heider was a pioneer in studying how people make luck and skill attributions about the results of their behavior.
Taking credit for the good stuff means we will often reinforce decisions that shouldn’t be reinforced and miss opportunities to see where we could have done better.
We see this pattern of blaming others for bad outcomes and failing to give them credit for good ones all over the place. When someone else at work gets a promotion instead of us, do we admit they worked harder and deserved it more than we did? No, it was because they schmoozed the boss. If someone does better on a test at school, it was because the teacher likes them more. If someone explains the circumstances of a car accident and how it wasn’t their fault, we roll our eyes. We assume the cause was their bad driving.
My biased assessment of why they were winning slowed my learning down considerably. I missed out on a lot of opportunities to make money because I dismissed other players as lucky when I might have been learning from watching them. To be sure, some of those people shouldn’t have been playing those hands and were actually playing poorly. But, as I figured out almost a year into playing, not all of them.
Ideally, our happiness would depend on how things turn out for us regardless of how things turn out for anyone else. Yet, on a fundamental level, fielding someone’s bad outcome as their fault feels good to us. On a fundamental level, fielding someone’s good outcome as luck helps our narrative along.
What accounts for most of the variance in happiness is how we’re doing comparatively.
A lot of the way we feel about ourselves comes from how we think we compare with others. This robust and pervasive habit of mind impedes learning.
For him, the opportunity to learn from his mistakes was much more important than treating that dinner as a self-satisfying celebration. He earned a half-million dollars and won a lengthy poker tournament over world-class competition, but all he wanted to do was discuss with a fellow pro where he might have made better decisions.
Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward. A habit could involve eating cookies: the cue might be hunger, the routine going to the pantry and grabbing a cookie, and the reward a sugar high. Or, in poker, the cue might be winning a hand, the routine taking credit for it, the reward a boost to our ego. Charles Duhigg, in The Power of Habit, offers the golden rule of habit change—that the best way to deal with a habit is to respect the habit loop: “To change a habit, you must keep the old cue, and deliver the old reward, but insert a new
…
The golden rule of habit change says we don’t have to give up the reward of a positive update to our narrative, nor should we. Duhigg recognizes that respecting the habit loop means respecting the way our brain is built.
Our brain is built to seek positive self-image updates. It is also built to view ourselves in competition with our peers. We can’t install new hardware. Working with the way our brains are built in reshaping habit has a higher chance of success than working against it. Better to change the part that is more plastic: the routine of what gives us the good feeling in our narrative and the features by which we compare ourselves to others.
We can get to this mindset shift by behaving as if we have something at risk when we sort outcomes into the luck and skill buckets, because we do have a lot at risk on that fielding decision.
The key is that in explicitly recognizing that the way we field an outcome is a bet, we consider a greater number of alternative causes more seriously than we otherwise would have. That is truthseeking.
When we treat outcome fielding as a bet, it pushes us to field outcomes more objectively into the appropriate buckets because that is how bets are won.
Thinking in bets triggers a more open-minded exploration of alternative hypotheses, of reasons supporting conclusions opposite to the routine of self-serving bias.
We are more likely to explore the opposite side of an argument more often and more seriously—and that will move us closer to the truth of the matter.
A good strategy for figuring out which way to bet would be to imagine if that outcome had happened to us.
let’s spare a little of the self-congratulations and, instead, examine that great result the way we’d examine it if it happened to someone else. We’ll be more likely to find the things we could have done even better and identify those factors that we had no control over. Perspective taking gets us closer to the truth because that truth generally lies in the middle of the way we field outcomes for ourselves and the way we field them for others.
Identifying a negative outcome doesn’t have the same personal sting if you turn it into a positive by finding things to learn from it. You don’t have to be on the defensive side of every negative outcome because you can recognize, in addition to things you can improve, things you did well and things outside your control. You realize that not knowing is okay.
Duhigg tells us that reshaping a habit requires time, preparation, practice, and repetition.
I have to identify the habit I want to change, figure out the routine to substitute, and practice that routine in deliberative mind until the habit is reshaped.
Any improvement in our decision quality puts us in a better position in the future. Think of it like a ship sailing from New York to London. If the ship’s navigator introduces a one-degree navigation error, it would start off as barely noticeable. Unchecked, however, the ship would veer farther and farther off course and would miss London by miles, as that one-degree miscalculation compounds mile over mile. Thinking in bets corrects your course. And even a small correction will get you more safely to your destination.
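The scale of that drift is easy to check with basic trigonometry. As a rough sketch (assuming a straight-line course and taking the New York–London distance to be roughly 3,500 miles), a constant one-degree heading error produces a cross-track miss of:

```latex
% Cross-track error e for a constant heading error \theta held over distance d.
% Assumed figures: d \approx 3500 miles, \theta = 1^\circ (a simplified flat-course model).
e \approx d \sin\theta
  = 3500 \,\text{mi} \times \sin(1^\circ)
  \approx 3500 \times 0.0175
  \approx 61 \,\text{mi}
```

So an error too small to notice at the start compounds into missing the destination by some sixty miles, which is why even a small course correction matters.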
In the classic science-fiction film The Matrix, when Neo (played by Keanu Reeves) meets Morpheus (the hero-hacker played by Laurence Fishburne), Neo asks Morpheus to tell him what “the matrix” is. Morpheus offers to show Neo, giving him the choice between taking a blue pill and a red pill. “You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland and I show you how deep the rabbit hole goes.” As Neo reaches toward a pill, Morpheus reminds him, “Remember, all I am offering is the truth. Nothing more.”
Members of our decision pod could be our friends, or members of our family, or an informal pod of coworkers, or an enterprise strategy group, or a professional organization where members can talk about their decision-making. Forming or joining a group where the focus is on thinking in bets means modifying the usual social contract. It means agreeing to be open-minded to those who disagree with us, giving credit where it’s due, and taking responsibility where it’s appropriate, even (and especially) when it makes us uncomfortable. That’s why, when we do it with others, we need to make it clear
…
In fact, as long as there are three people in the group (two to disagree and one to referee*), the truthseeking group can be stable and productive.
In other words, confirmatory thought amplifies bias, promoting and encouraging motivated reasoning because its main purpose is justification. Confirmatory thought promotes a love and celebration of one’s own beliefs, distorting how the group processes information and works through decisions, the result of which can be groupthink.
Exploratory thought, on the other hand, encourages an open-minded and objective consideration of alternative hypotheses and a tolerance of dissent to combat bias. Exploratory thought helps the members of a group reason toward a more accurate representation of the world.
This is why it’s so important to have intellectual and ideological diversity within any group or institution whose goal is to find truth.
We don’t win bets by being in love with our own ideas. We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world.
He encouraged me to find things I might have control over and to improve my decisions about those.
that our craving for approval is incredibly strong and incentivizing.
Motivated reasoning and self-serving bias are two habits of mind that are deeply rooted in how our brains work. We have a huge investment in confirmatory thought, and we fall into these biases all the time without even knowing it. Confirmatory thought is hard to spot, hard to change, and, if we do try changing it, hard to self-reinforce.
Identifying mistakes in hands I won reinforced the separation between outcomes and decision quality.
A diverse group can do some of the heavy lifting of de-biasing for us.
Dissent channels and red teams are a beautiful implementation of Mill’s bedrock principle that we can’t know the truth of a matter without hearing the other side. This commitment to diversity of opinion is something that we would be wise to apply to our own decision groups.
the Heterodox Academy effort shows that there is a natural drift toward homogeneity and confirmatory thought. We all experience this gravitation toward people who think like we do. Scientists, overwhelmingly trained and chartered toward truthseeking, aren’t immune.
We see that even judges and scientists succumb to these biases. We shouldn’t feel bad, whatever our situation, about admitting that we also need help.
groups with diverse viewpoints are the best protection against confirmatory thought.
check your Twitter feed for whom you follow. It’s a pretty safe bet that the bulk of them are ideologically aligned with you. If that’s the case, start following some people from the other side of the aisle.
surprised to learn that the expert opinion expressed as a bet was more accurate than expert opinion expressed through peer review, since peer review is considered a rock-solid foundation of the scientific method.
don’t disparage or ignore an idea just because you don’t like who or where it came from.
If we want to engage someone with whom we have some disagreement (inside or outside our group), they will be more open and less defensive if we start with those areas of agreement, which there surely will be. It is rare that we disagree with everything that someone has to say.

