Kindle Notes & Highlights
Thinking in Bets, by Annie Duke
Read between November 12, 2020 and December 30, 2021
Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
The good news is that we can find practical work-arounds and strategies to keep us out of the traps that lie between the decisions we’d like to be making and the execution of those decisions. The promise of this book is that thinking in bets will improve decision-making throughout our lives. We can get better at separating outcome quality from decision quality, discover the power of saying, “I’m not sure,” learn strategies to map out the future, become less reactive decision-makers, build and sustain pods of fellow truthseekers to improve our decision process, and recruit our past and future …
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Pete Carroll made a good-quality decision that got a bad result. He was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.
Why are we so bad at separating luck and skill? Why are we so uncomfortable knowing that results can be beyond our control? Why do we create such a strong connection between results and the quality of the decisions preceding them?
Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.
When I consult with executives, I sometimes start with this exercise. I ask group members to come to our first meeting with a brief description of their best and worst decisions of the previous year. I have yet to come across someone who doesn’t identify their best and worst results rather than their best and worst decisions.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say, “I should have known that would happen,” or, “I should have seen it coming,” we are succumbing to hindsight bias.
We link results with decisions even though it is easy to point out indisputable examples where decisions and results aren’t so perfectly correlated.
To start, our brains evolved to create certainty and order. We are uncomfortable with the idea that luck plays a significant role in our lives. We recognize the existence of luck, but we resist the idea that, despite our best efforts, things might not work out the way we want. It feels better for us to imagine the world as an orderly place, where randomness does not wreak havoc and things are perfectly predictable. We evolved to see the world that way. Creating order out of chaos has been necessary for our survival.
When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.
I particularly like the descriptive labels “reflexive mind” and “deliberative mind” favored by psychologist Gary Marcus. In his 2008 book, Kluge: The Haphazard Evolution of the Human Mind, he wrote, “Our thinking can be divided into two streams, one that is fast, automatic, and largely unconscious, and another that is slow, deliberate, and judicious.” The first system, “the reflexive system, seems to do its thing rapidly and automatically, with or without our conscious awareness.” The second system, “the deliberative system . . . deliberates, it considers, it chews over the facts.”
Making more rational decisions isn’t just a matter of willpower or consciously handling more decisions in deliberative mind. Our deliberative capacity is already maxed out.
The challenge is not to change the way our brains operate but to figure out how to work within the limitations of the brains we already have.
Our goal is to get our reflexive minds to execute on our deliberative minds’ best intentions.
William Poundstone, author of a widely read book on game theory, Prisoner’s Dilemma, called von Neumann and Morgenstern’s Theory of Games and Economic Behavior “one of the most influential and least-read books of the twentieth century.”
Trouble follows when we treat life decisions as if they were chess decisions.
Chess, for all its strategic complexity, isn’t a great model for decision-making in life, where most of our decisions involve hidden information and a much greater influence of luck.
Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.
If we want to improve in any game—as well as in any aspect of our lives—we have to learn from the results of our decisions. The quality of our lives is the sum of decision quality plus luck.
Our lives are too short to collect enough data from our own experience to easily isolate decision quality within the small set of results we experience.
When someone asks you about a coin they flipped four times, there is a correct answer: “I’m not sure.”
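A quick arithmetic aside (my illustration, not a passage from the book): a perfectly fair coin lands all heads or all tails in four flips with probability 2/16, or 12.5%, so even a maximally lopsided result is weak evidence about the coin. The Python sketch below simulates this:

```python
import random

def four_flips():
    """Flip a fair coin four times; return the number of heads."""
    return sum(random.random() < 0.5 for _ in range(4))

trials = 100_000
lopsided = sum(four_flips() in (0, 4) for _ in range(trials))
# A fair coin produces an all-heads or all-tails run about 12.5% of
# the time -- far too often to justify any confident verdict on it.
print(f"all-heads or all-tails rate for a fair coin: {lopsided / trials:.1%}")
```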
We are discouraged from saying “I don’t know” or “I’m not sure.” We regard those expressions as vague, unhelpful, and even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision-maker. We have to make peace with not knowing.
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”
There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers. Here are two of them. First, “I’m not sure” is simply a more accurate representation of the world. Second, and related, when we accept that we can’t be sure, we are less likely to fall into the trap of black-and-white thinking.
Look how quickly you can begin to redefine what it means to be wrong. Once we start thinking like this, it becomes easier to resist the temptation to make snap judgments after results or say things like “I knew it” or “I should have known.” Better decision-making and more self-compassion follow.
Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold.
When we move away from a world where there are only two opposing and discrete boxes that decisions can be put in—right or wrong—we start living in the continuum between the extremes. Making better decisions stops being about wrong or right and starts being about calibrating among all the shades of grey.
Being right feels really good. “I was right,” “I knew it,” “I told you so”—those are all things that we say, and they all feel very good to us. Should we be willing to give up the good feeling of “right” to get rid of the anguish of “wrong”? Yes.
Getting comfortable with this realignment, and all the good things that follow, starts with recognizing that you’ve been betting all along.
The promise of this book is that if we follow the example of poker players by making explicit that our decisions are bets, we can make better decisions and anticipate (and take protective measures) when irrationality is likely to keep us from acting in our best interest.
Every decision commits us to some course of action that, by definition, eliminates acting on other alternatives. Not placing a bet on something is, itself, a bet.
There is always opportunity cost in choosing one path over others.
One of the reasons we don’t naturally think of decisions as bets is because we get hung up on the zero-sum nature of the betting that occurs in the gambling world: betting against somebody else (or the casino), where the gains and losses are symmetrical. One person wins, the other loses, and the net between the two adds to zero. Betting includes, but is not limited to, those situations.
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.
The futures we imagine are merely possible. They haven’t happened yet. We can only make our best guess, given what we know and don’t know, at what the future will look like.
Part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs to more accurately represent the world. The more accurate our beliefs, the better the foundation of the bets we make. There is also skill in identifying when our thinking patterns might lead us astray, no matter what our beliefs are, and in developing strategies to work with (and sometimes around) those thinking patterns. There are effective strategies to be more open-minded, more objective, more accurate in our beliefs, more rational in our …
This is how we think we form abstract beliefs: (1) we hear something; (2) we think about it and vet it, determining whether it is true or false; (3) only after that do we form our belief. It turns out, though, that we actually form abstract beliefs this way: (1) we hear something; (2) we believe it to be true; (3) only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
As psychologist Daniel Gilbert summarized the research: “Findings from a multitude of research literatures converge on a single point: People are credulous creatures who find it very easy to believe and very difficult to doubt. In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.”
We form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
Whether it is a football game, a protest, or just about anything else, our pre-existing beliefs influence the way we experience the world. That those beliefs aren’t formed in a particularly orderly way leads to all sorts of mischief in our decision-making.
It doesn’t take much for any of us to believe something. And once we believe it, protecting that belief guides how we treat further information relevant to the belief.
If we think of beliefs as only 100% right or 100% wrong, when confronting new information that might contradict our belief, we have only two options: (a) make the massive shift in our opinion of ourselves from 100% right to 100% wrong, or (b) ignore or discredit the new information. It feels bad to be wrong, so we choose (b).
How we form beliefs, and our inflexibility about changing our beliefs, has serious consequences because we bet on those beliefs.
Surprisingly, being smart can actually make bias worse. Let me give you a different intuitive frame: the smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view.
It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.
“Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs. The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.
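A minimal expected-value sketch (my own illustration with assumed numbers, not the book’s): suppose an event’s true long-run frequency is 60%, and two bettors take $1 even-money bets, each backing “yes” exactly when their belief puts the chance above 50%. The bettor whose belief matches reality picks the profitable side:

```python
import random

TRUE_P = 0.60  # the event's actual long-run frequency (assumed for the demo)

def avg_profit(belief, n=100_000):
    """Average profit per $1 even-money bet for a bettor who backs
    'yes' exactly when their belief puts the chance above 50%."""
    takes_yes = belief > 0.5
    wins = sum((random.random() < TRUE_P) == takes_yes for _ in range(n))
    return (2 * wins - n) / n  # +$1 per win, -$1 per loss

print(f"belief 0.60 (accurate):   {avg_profit(0.60):+.3f} per bet")
print(f"belief 0.40 (inaccurate): {avg_profit(0.40):+.3f} per bet")
```

Over many bets, the accurate bettor averages about +$0.20 per dollar risked and the inaccurate one about -$0.20: the long run rewards whoever’s beliefs sit closer to the truth.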
The more we recognize that we are betting on our beliefs (with our happiness, attention, health, money, time, or some other limited resource), the more we are likely to temper our statements, getting closer to the truth as we acknowledge the risk inherent in what we believe.
We can train ourselves to view the world through the lens of “Wanna bet?” Once we start doing that, we are more likely to recognize that there is always a degree of uncertainty, that we are generally less sure than we thought we were, that practically nothing is black and white, 0% or 100%. And that’s a pretty good philosophy for living.