Kindle Notes & Highlights
Thinking in Bets by Annie Duke
Read between January 12 and March 22, 2021
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck.
We have a tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.”
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.
The challenge is not to change the way our brains operate but to figure out how to work within the limitations of the brains we already have.
Resulting, assuming that our decision-making is good or bad based on a small set of outcomes, is a pretty reasonable strategy for learning in chess. But not in poker—or life.
“I’m not sure” does not mean that there is no objective truth. Firestein’s point is, in fact, that acknowledging uncertainty is the first step in executing on our goal to get closer to what is objectively true. To do this, we need to stop treating “I don’t know” and “I’m not sure” like strings of dirty words.
What good poker players and good decision-makers have in common is their comfort with the world being an uncertain and unpredictable place.
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information.
It doesn’t take much for any of us to believe something. And once we believe it, protecting that belief guides how we treat further information relevant to the belief.
Acknowledging uncertainty is the first step in measuring and narrowing it.
Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance toward information that disagrees with us.
This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek.
There is no sin in finding out there is evidence that contradicts what we believe. The only sin is in not using that evidence as objectively as possible to refine that belief going forward.
Declaring our uncertainty in our beliefs to others makes us more credible communicators.
Thoughtful and self-aware people are more believable.
By saying, “I’m 80%” and thereby communicating we aren’t sure, we open the door for others to tell us what they know. That will make our beliefs much more accurate over time, as we are more likely to gather relevant information.
Acknowledging that decisions are bets based on our beliefs, getting comfortable with uncertainty, and redefining right and wrong are integral to a good overall approach to decision-making.
(Never mind that he was compromising a vital element of his strategy of subterfuge by constantly showing and telling us he was doing this. Given that he had such an entrenched set of beliefs, it’s not surprising that he didn’t see the incongruity.)
Aldous Huxley recognized, “Experience is not what happens to a man; it is what a man does with what happens to him.”
How we figure out what—if anything—we should learn from an outcome becomes another bet.
As outcomes come our way, figuring out whether those outcomes were caused mainly by luck or whether they were the predictable result of particular decisions we made is a bet of great consequence.
The bets we make on when and how to close the feedback loop are part of the execution, all those in-the-moment decisions about whether something is a learning opportunity. To reach our long-term goals, we have to improve at sorting out when the unfolding future has something to teach us, when to close the feedback loop.
If making the same decision again would predictably result in the same outcome, or if changing the decision would predictably result in a different outcome, then the outcome following that decision was due to skill.
If, however, an outcome occurs because of things that we can’t control (like the actions of others, the weather, or our genes), the result would be due to luck. If our decisions didn’t have much impact on the way things turned out, then luck would be the main influence.
When it comes to self-serving bias, we act as if our good outcomes are perfectly correlated to good skill and our bad outcomes are perfectly correlated to bad luck.
If we put in the work to practice this routine, we can field more of our outcomes in an open-minded, more objective way, motivated by accuracy and truthseeking to drive learning. The habit of mind will change, and our decision-making will better align with executing on our long-term goals.
Thinking in bets triggers a more open-minded exploration of alternative hypotheses, of reasons supporting conclusions opposite to the routine of self-serving bias. We are more likely to explore the opposite side of an argument more often and more seriously—and that will move us closer to the truth of the matter.
The benefits of recognizing just a few extra learning opportunities compound over time. The cumulative effect of being a little better at decision-making, like compounding interest, can have huge effects in the long run on everything that we do.
Groups can improve the thinking of individual decision-makers when the individuals are accountable to a group whose interest is in accuracy.
Interacting with similarly motivated people improves the ability to combat bias not just during direct interactions but when we are making and analyzing decisions on our own. The group gets into our head—in a good way—reshaping our decision habits.
It is one thing to commit to rewarding ourselves for thinking in bets, but it is a lot easier if we get others to do the work of rewarding us.
Groups with diverse viewpoints are the best protection against confirmatory thought.
“[n]obody has found a way to eradicate confirmation bias in individuals, but we can diversify the field to the point to where individual viewpoint biases begin to cancel out each other.”
Scientists would be more accurate if they had to bet on the likelihood that results would replicate, as compared to traditional peer review, which can be vulnerable to viewpoint bias.
Experts engaging in traditional peer review, providing their opinion on whether an experimental result would replicate, were right 58% of the time. A betting market in which the traders were those same experts, with their own money at stake, proved more accurate.
Communism (data belong to the group), Universalism (apply uniform standards to claims and evidence, regardless of where they came from), Disinterestedness (vigilance against potential conflicts that can influence the group’s evaluation), and Organized Skepticism (discussion among the group to encourage engagement and dissent).
Subjects who saw an image of their present-self in the mirror allocated on average $73.90 to the retirement account. Other subjects looking in that mirror saw an age-progressed version of themselves. This latter group of subjects, on average, allocated $178.10 to the retirement account.
Suppose we could travel back in time to change a choice likely to result in a bad outcome. Then we could embrace Thoreau’s view and harness the power of regret because it would serve a valuable purpose. It would be helpful, then, if we could get regret to do some time traveling of its own, moving before our decisions instead of after them. That way, regret might be able to keep us from making a bad bet. In addition, it wouldn’t, as Nietzsche implied, rear its head later by causing us to make a remorse-fueled second mistake.
The way we field outcomes is path dependent. It doesn’t so much matter where we end up as how we got there.
In most situations, you can’t make a precommitment that’s 100% tamper-proof. The hurdles aren’t necessarily high, but they nevertheless create a decision-interrupt that may prompt us to do the bit of time travel necessary to reduce emotion and encourage perspective and rationality in the decision.
But it turns out that better decision trees and more effective scenario planning result from working backward rather than forward.
“[P]rospective hindsight—imagining that an event has already occurred—increases the ability to correctly identify reasons for future outcomes by 30%.”
The most common form of working backward from our goal to map out the future is known as backcasting.
A premortem is an investigation into something awful, but before it happens.
People who imagine obstacles in the way of reaching their goals are more likely to achieve success.