Kindle Notes & Highlights
by Annie Duke
Read between November 12, 2020 - December 30, 2021
We would be better served as communicators and decision-makers if we thought less about whether we are confident in our beliefs and more about how confident we are.
What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten?
There is no sin in finding out there is evidence that contradicts what we believe. The only sin is in not using that evidence as objectively as possible to refine that belief going forward.
Declaring our uncertainty in our beliefs to others makes us more credible communicators.
Expressing our beliefs this way also serves our listeners. We know that our default is to believe what we hear, without vetting the information too carefully. If we communicate to our listeners that we are not 100% on what we are saying, they are less likely to walk away having been infected by our beliefs. Expressing the belief as uncertain signals to our listeners that the belief needs further vetting, that step three is still in progress.
By communicating our own uncertainty when sharing beliefs with others, we are inviting the people in our lives to act like scientists with us.
Aldous Huxley recognized, “Experience is not what happens to a man; it is what a man does with what happens to him.”
As the future unfolds into a set of outcomes, we are faced with another decision: Why did something happen the way it did? How we figure out what—if anything—we should learn from an outcome becomes another bet. As outcomes come our way, figuring out whether those outcomes were caused mainly by luck or whether they were the predictable result of particular decisions we made is a bet of great consequence. If we determine our decisions drove the outcome, we can feed the data we get following those decisions back into belief formation and updating, creating a learning loop:
Learning might proceed in a more ideal way if life were more like chess than poker. The connection between outcome quality and decision quality would be clearer because there would be less uncertainty. The challenge is that any single outcome can happen for multiple reasons. The unfolding future is a big data dump that we have to sort and interpret. And the world doesn’t connect the dots for us between outcomes and causes.
We are good at identifying the “-ER” goals we want to pursue (better, smarter, richer, healthier, whatever). But we fall short in achieving our “-ER” because of the difficulty in executing all the little decisions along the way to our goals.
Just as we are almost never 100% wrong or right, outcomes are almost never 100% due to luck or skill.
Outcomes are rarely the result of our decision quality alone or chance alone, and outcome quality is not a perfect indicator of the influence of luck or skill. When it comes to self-serving bias, we act as if our good outcomes are perfectly correlated to good skill and our bad outcomes are perfectly correlated to bad luck.*
Watching is an established learning method.
Unfortunately, learning from watching others is just as fraught with bias.
We use the same black-and-white thinking as with our own outcomes, but now we flip the script. Where we blame our own bad outcomes on bad luck, when it comes to our peers, bad outcomes are clearly their fault. While our own good outcomes are due to our awesome decision-making, when it comes to other people, good outcomes are because they got lucky. As artist and writer Jean Cocteau said, “We must believe in luck. For how else can we explain the success of those we don’t like?”
Schadenfreude is basically the opposite of compassion.
As Richard Dawkins points out, natural selection proceeds by competition among the phenotypes of genes, so we literally evolved to compete, a drive that allowed our species to survive.
What accounts for most of the variance in happiness is how we’re doing comparatively.
By shifting what it is that makes us feel good about ourselves, we can move toward a more rational fielding of outcomes and a more compassionate view of others. We can learn better and be more open-minded if we work toward a positive narrative driven by engagement in truthseeking and striving toward accuracy and objectivity: giving others credit when it’s due, admitting when our decisions could have been better, and acknowledging that almost nothing is black and white.
Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward.
the golden rule of habit change—that the best way to deal with a habit is to respect the habit loop: “To change a habit, you must keep the old cue, and deliver the old reward, but insert a new routine.”
Working with the way our brains are built when reshaping a habit has a higher chance of success than working against it. Better to change the part that is more plastic: the routine of what gives us the good feeling in our narrative and the features by which we compare ourselves to others.
Instead of feeling bad when we have to admit a mistake, what if the bad feeling came from the thought that we might be missing a learning opportunity just to avoid blame?
When we look at the people performing at the highest level of their chosen field, we find that the self-serving bias that interferes with learning often recedes and even disappears. The people with the most legitimate claim to a bulletproof self-narrative have developed habits around accurate self-critique.
Keep the reward of feeling like we are doing well compared to our peers, but change the features by which we compare ourselves: be a better credit-giver than your peers, more willing than others to admit mistakes, more willing to explore possible reasons for an outcome with an open mind, even, and especially, if that might cast you in a bad light or shine a good light on someone else.
Ideally, we wouldn’t compare ourselves with others or get a good feeling when the comparison favors us.
Thinking in bets triggers a more open-minded exploration of alternative hypotheses, of reasons supporting conclusions opposite to the routine of self-serving bias. We are more likely to explore the opposite side of an argument more often and more seriously—and that will move us closer to the truth of the matter.
Perspective taking gets us closer to the truth because that truth generally lies in the middle of the way we field outcomes for ourselves and the way we field them for others. By taking someone else’s perspective, we are more likely to land in that middle ground.
Identifying a negative outcome doesn’t have the same personal sting if you turn it into a positive by finding things to learn from it.
To be sure, thinking in bets is not a miracle cure. Thinking in bets won’t make self-serving bias disappear or motivated reasoning vanish into thin air. But it will make those things better. And a little bit better is all we need to transform our lives.
The benefits of recognizing just a few extra learning opportunities compound over time. The cumulative effect of being a little better at decision-making, like compounding interest, can have huge effects in the long run on everything that we do.
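As an illustrative aside (not from the book), the compounding analogy can be made concrete with a toy calculation. The 1% edge and the multiplicative model below are assumptions chosen purely for illustration:

```python
# Toy illustration of the compounding analogy: a small, repeated
# improvement grows geometrically over many decisions.
# The 1% edge per step is an assumed figure, not one from the book.

def compounded(edge_per_step: float, steps: int) -> float:
    """Cumulative multiplier after `steps` decisions,
    each a fraction `edge_per_step` better than baseline."""
    return (1 + edge_per_step) ** steps

# Being 1% better, once a day for a year:
print(f"{compounded(0.01, 365):.1f}x")  # roughly 38x the baseline
```

Under this (deliberately simplified) model, the same 1% edge applied once yields almost nothing, but applied daily for a year it multiplies the baseline nearly fortyfold — which is the sense in which small decision improvements compound.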
Such interactions are reminders that not all situations are appropriate for truthseeking, nor are all people interested in the pursuit.
I learned from this experience that thinking in bets was easier if I had other people to help me.
Having the help of others provides many decision-making benefits, but one of the most obvious is that other people can spot our errors better than we can.
In fact, as long as there are three people in the group (two to disagree and one to referee*), the truthseeking group can be stable and productive.
Being in a group can improve our decision quality by exploring alternatives and recognizing where our thinking might be biased, but a group can also exacerbate our tendency to confirm what we already believe.
In combination, the advice of these experts in group interaction adds up to a pretty good blueprint for a truthseeking charter: A focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group; Accountability, for which members have advance notice; and Openness to a diversity of ideas. An agreement along these lines creates a common bond and shared fate among members, allowing the group to produce sound reasoning.
Even better, interacting with similarly motivated people improves the ability to combat bias not just during direct interactions but when we are making and analyzing decisions on our own. The group gets into our head—in a good way—reshaping our decision habits.
We internalize the group’s approval, and, as a matter of habit, we begin to do the kind of things that would earn it when we are away from the group (which is, after all, most of the time).
Accountability is a willingness or obligation to answer for our actions or beliefs to others. A bet is a form of accountability.
“The only way in which a human being can make some approach to knowing the whole of a subject, is by hearing what can be said about it by persons of every variety of opinion, and studying all modes in which it can be looked at by every character of mind. No wise man ever acquired his wisdom in any mode but this; nor is it in the nature of human intellect to become wise in any other manner.”
Others aren’t wrapped up in preserving our narrative, anchored by our biases. It is a lot easier to have someone else offer their perspective than for you to imagine you’re another person and think about what their perspective might be.
The authors concluded that the result endorsed the importance of exposure to diverse viewpoints: “What is necessary is reasonable diversity, or diversity of reasonable views . . . and that it is important to ensure that judges, no less than anyone else, are exposed to it, and not merely through the arguments of advocates.”
At least one study has found that, yes, a betting market where scientists wager on the likelihood of experimental results replicating was more accurate than expert opinion alone.
People are more willing to offer their opinion when the goal is to win a bet rather than get along with people in a room.
If you want to pick a role model for designing a group's practical rules of engagement, you can't do better than Merton. To start, he coined the phrase "role model," along with "self-fulfilling prophecy," "reference group," "unintended consequences," and "focus group." He founded the sociology of science and was the first sociologist awarded the National Medal of Science.
Not surprisingly, Merton’s paper would make an excellent career guide for anyone seeking to be a profitable bettor, or a profitable decision-maker period.
Within our own decision pod, we should strive to abide by the rule that “more is more.” Get all the information out there.
As a rule of thumb, if we have an urge to leave out a detail because it makes us uncomfortable or requires even more clarification to explain away, those are exactly the details we must share.
Even without conflicting versions, the Rashomon Effect reminds us that we can’t assume one version of a story is accurate or complete.