Kindle Notes & Highlights
by Annie Duke
Read between February 12 - March 16, 2022
those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments.
Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
Mistakes, emotions, losing—those things are all inevitable because we are human. The approach of thinking in bets moved me toward objectivity, accuracy, and open-mindedness. That movement compounds over time to create significant changes in our lives.
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Carroll got unlucky. He had control over the quality of the play-call decision, but not over how it turned out. It was exactly because he didn’t get a favorable result that he took the heat.
Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.
Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say, “I should have known that would happen,” or, “I should have seen it coming,” we are succumbing to hindsight bias.
our brains evolved to create certainty and order. We are uncomfortable with the idea that luck plays a significant role in our lives. We recognize the existence of luck, but we resist the idea that, despite our best efforts, things might not work out the way we want.
When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.
Daniel Kahneman, in his 2011 best-selling Thinking, Fast and Slow, popularized the labels of “System 1” and “System 2.” He characterized System 1 as “fast thinking.” System 1 is what causes you to hit the brakes the instant someone jumps into the street in front of your car. It encompasses reflex, instinct, intuition, impulse, and automatic processing. System 2, “slow thinking,” is how we choose, concentrate, and expend mental energy. Kahneman explains how System 1 and System 2 are capable of dividing and conquering our decision-making but work mischief when they conflict.
the descriptive labels “reflexive mind” and “deliberative mind” favored by psychologist Gary Marcus. In his 2008 book, Kluge: The Haphazard Evolution of the Human Mind, he wrote, “Our thinking can be divided into two streams, one that is fast, automatic, and largely unconscious, and another that is slow, deliberate, and judicious.” The first system, “the reflexive system, seems to do its thing rapidly and automatically, with or without our conscious awareness.” The second system, “the deliberative system . . . deliberates, it considers, it chews over the facts.”
No one wakes up in the morning and says, “I want to be closed-minded and dismissive of others.” But what happens when we’re focused on work and a fluff-headed coworker approaches? Our brain is already using body language and curt responses to get rid of them without flouting conventions of politeness. We don’t deliberate over this; we just do it.
Our goal is to get our reflexive minds to execute on our deliberative minds’ best intentions.
Poker players, as a result, must become adept at in-the-moment decision-making or they won’t survive in the profession.
Making a living at poker requires interpolating between the deliberative and reflexive systems. The best players must find ways to harmonize otherwise irresolvable conflicts.
Solving the problem of how to execute is even more important than innate talent to succeed in poker. All the talent in the world won't matter if a player can't execute: avoiding common decision traps, learning from results in a rational way, and keeping emotions out of the process as much as possible.
Game theory was succinctly defined by economist Roger Myerson (one of the game-theory Nobel laureates) as “the study of mathematical models of conflict and cooperation between intelligent rational decision-makers.”
‘Chess is not a game. Chess is a well-defined form of computation. You may not be able to work out the answers, but in theory there must be a solution, a right procedure in any position. Now, real games,’ he said, ‘are not like that at all. Real life is not like that.’
Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.
In chess, outcomes correlate more tightly with decision quality. In poker, it is much easier to get lucky and win, or get unlucky and lose.
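The gap between decision quality and outcome quality can be sketched with a small simulation. The win probabilities below are hypothetical illustrations, not figures from the text: a strong poker decision that wins, say, 70% of the time still loses almost a third of its iterations, while a near-deterministic chess advantage almost never does.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def loss_rate(win_prob, n_trials=10_000):
    """Fraction of trials in which the *better* decision still loses,
    given its probability of winning a single iteration."""
    losses = sum(1 for _ in range(n_trials) if random.random() > win_prob)
    return losses / n_trials

# Hypothetical numbers: identical decision quality, very different
# reliability of the feedback the outcomes provide.
poker_loss_rate = loss_rate(win_prob=0.70)  # roughly 0.30
chess_loss_rate = loss_rate(win_prob=0.99)  # roughly 0.01
```

The point of the sketch is "resulting" in miniature: judging the 70% decision by any single losing trial would condemn a good decision about a third of the time.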
If we want to improve in any game—as well as in any aspect of our lives—we have to learn from the results of our decisions.
We are discouraged from saying “I don’t know” or “I’m not sure.” We regard those expressions as vague, unhelpful, and even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision-maker. We have to make peace with not knowing.
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”
The secret is to make peace with walking around in a world where we recognize that we are not sure and that’s okay. As we learn more about how our brains operate, we recognize that we don’t perceive the world objectively. But our goal should be to try.
Blaming the oddsmakers or the odds themselves assumes that once something happens, it was bound to have happened and anyone who didn’t see it coming was wrong.
Decisions are bets on the future, and they aren’t “right” or “wrong” based on whether they turn out well on any particular iteration.
Redefining wrong allows us to let go of all the anguish that comes from getting a bad result. But it also means we must redefine “right.” If we aren’t wrong just because things didn’t work out, then we aren’t right just because things turned out well.
Being right feels really good. “I was right,” “I knew it,” “I told you so”—those are all things that we say, and they all feel very good to us. Should we be willing to give up the good feeling of “right” to get rid of the anguish of “wrong”? Yes.
losses in general feel about two times as bad as wins feel good. So winning $100 at blackjack feels as good to us as losing $50 feels bad to us.
Because being right feels like winning and being wrong feels like losing, that means we need two favorable results for every one unfavorable result just to break even emotionally.
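The 2-to-1 arithmetic above can be made explicit with a toy "felt value" function. The coefficient is the approximate figure cited in the text (losses weigh about twice as much as wins), not an exact law:

```python
LOSS_AVERSION = 2.0  # approximate 2:1 weighting of losses vs. wins

def felt_value(outcome):
    """Subjective 'felt' value of a monetary outcome: gains count at
    face value, losses count roughly double."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# Winning $100 feels about as good as losing $50 feels bad:
assert felt_value(100) == -felt_value(-50)

# Two same-sized wins per loss is roughly emotional break-even:
assert felt_value(100) + felt_value(100) + felt_value(-100) == 0
```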
Although employers aren’t trying to entice employees to quit, their goal is similar in arriving at a compensation package to get the prospect to accept the offer and stay in the job. They must balance offering attractive pay and benefits with going too far and impairing their ability to make a profit.
By treating decisions as bets, poker players explicitly recognize that they are deciding on alternative futures, each with benefits and risks. They also recognize there are no simple answers. Some things are unknown or unknowable.
The definition of “bet” is much broader. Merriam-Webster’s Online Dictionary defines “bet” as “a choice made by thinking about what will probably happen,” “to risk losing (something) when you try to do or achieve something,” and “to make decisions that are based on the belief that something will happen or is true.”
If we accept a job offer, we are also choosing to foreclose all other alternatives: we aren’t sticking with our current job, or negotiating to get a better deal in our current job, or getting or taking other offers, or changing careers, or taking some time away from work. There is always opportunity cost in choosing one path over others.
Whenever we make a parenting choice (about discipline, nutrition, school, parenting philosophy, where to live, etc.), we are betting that our choice will achieve the future we want for our children more than any other choice we might make given the constraints of the limited resources we have to allocate—our time, our money, our attention.
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.
This is how we think we form abstract beliefs: we hear something; we think about it and vet it, determining whether it is true or false; only after that do we form our belief.
It turns out, though, that we actually form abstract beliefs this way: we hear something; we believe it to be true; only sometimes, later, if we have the time or the inclination, do we think about it and vet it, determining whether it is, in fact, true or false.
Gilbert and colleagues demonstrated through a series of experiments that our default is to believe that what we hear and read is true. Even when that information is clearly presented as being false, we are still likely to process it as true.
how we form beliefs was shaped by the evolutionary push toward efficiency rather than accuracy.
For survival-essential skills, type I errors (false positives) were less costly than type II errors (false negatives).
“nature does not start from scratch; rather, she is an inveterate jury rigger who rarely invents a new mechanism to do splendidly what an old mechanism can be modified to do tolerably well.”
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
A study in the 2012 Stanford Law Review called “They Saw a Protest” (the title is a homage to the original Hastorf and Cantril experiment) by Yale professor of law and psychology Dan Kahan, a leading researcher and analyst of biased reasoning, and four colleagues reinforces this notion that our beliefs drive the way we process information.
This irrational, circular information-processing pattern is called motivated reasoning. The way we process new information is driven by the beliefs we hold, strengthening them. Those strengthened beliefs then drive how we process further information, and so on.
Fake news isn’t meant to change minds. As we know, beliefs are hard to change. The potency of fake news is that it entrenches beliefs its intended audience already has, and then amplifies them.
Part of being “smart” is being good at processing information, parsing the quality of an argument and the credibility of the source. So, intuitively, it feels like smart people should have the ability to spot motivated reasoning coming and should have more intellectual resources to fight it.
the more numerate people (whether pro- or anti-gun) made more mistakes interpreting the data on the emotionally charged topic than the less numerate subjects sharing those same beliefs. “This pattern of polarization . . . does not abate among high-Numeracy subjects. Indeed, it increases.”
the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.