Kindle Notes & Highlights
by Annie Duke
Read between May 19, 2019 - April 16, 2021
Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.
When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.
Our goal is to get our reflexive minds to execute on our deliberative minds’ best intentions.
Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.
The quality of our lives is the sum of decision quality plus luck.
“Thoroughly conscious ignorance is the prelude to every real advance in science.”
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”
We know from Daniel Kahneman and Amos Tversky’s work on loss aversion, part of prospect theory (which won Kahneman the Nobel Prize in Economics in 2002), that losses in general feel about two times as bad as wins feel good. So winning $100 at blackjack feels as good to us as losing $50 feels bad to us. Because being right feels like winning and being wrong feels like losing, that means we need two favorable results for every one unfavorable result just to break even emotionally.
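A quick way to see the arithmetic in this highlight is to write the felt value of an outcome as a function. This is only a sketch, not anything from the book: it assumes the simplest linear form of a prospect-theory-style value function with a loss-aversion coefficient of 2, as the passage implies.

```python
# Minimal sketch (assumption: linear value function, loss-aversion coefficient of 2).
# Illustrates why winning $100 and losing $50 land on the same felt magnitude.

LOSS_AVERSION = 2.0  # losses feel about twice as bad as equal-sized gains feel good

def felt_value(outcome_dollars: float) -> float:
    """Subjective (felt) value of a monetary outcome under simple loss aversion."""
    if outcome_dollars >= 0:
        return outcome_dollars
    return LOSS_AVERSION * outcome_dollars  # losses are amplified

print(felt_value(100))   # 100.0: winning $100
print(felt_value(-50))   # -100.0: losing $50 feels just as bad
```

With that coefficient, one loss of a given size takes two equal-sized wins to offset emotionally, which is the "two favorable results for every one unfavorable result" claim.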
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.
Ignoring the risk and uncertainty in every decision might make us feel better in the short run, but the cost to the quality of our decision-making can be immense.
We bet based on what we believe about the world.
Part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs to more accurately represent the world.
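The book does not prescribe a formula for calibration, but Bayes' rule is one standard way to picture what "using experience and information to update our beliefs" can look like. The prior and likelihood numbers below are invented purely for illustration.

```python
# Illustrative only: a Bayes-rule update as one way to picture belief calibration.
# All numbers (prior, likelihoods) are made up for this example.

def update_belief(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return P(belief is true | evidence) given a prior and the two likelihoods."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start 70% confident, then see evidence twice as likely if the belief is false.
posterior = update_belief(prior=0.70, p_evidence_if_true=0.30, p_evidence_if_false=0.60)
print(round(posterior, 2))  # ~0.54: the belief should weaken rather than be explained away
```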
It turns out, though, that we actually form abstract beliefs this way: we hear something; we believe it to be true; only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether it is, in fact, true or false.
“Findings from a multitude of research literatures converge on a single point: People are credulous creatures who find it very easy to believe and very difficult to doubt. In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.”
As with many of our irrationalities, how we form beliefs was shaped by the evolutionary push toward efficiency rather than accuracy.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
Whether it is a football game, a protest, or just about anything else, our pre-existing beliefs influence the way we experience the world.
Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning.
He [Dan Kahan] discovered that the more numerate people (whether pro- or anti-gun) made more mistakes interpreting the data on the emotionally charged topic than the less numerate subjects sharing those same beliefs.
It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.
Letterman’s comment was actually quite perceptive. His mistake was offering up the insight in an inappropriate forum to someone who hadn’t agreed to that kind of truthseeking exchange.
Such interactions are reminders that not all situations are appropriate for truthseeking, nor are all people interested in the pursuit. That being said, any of us who wants to get better at thinking in bets would benefit from having more David Lettermans in our lives. As the “original” Letterman learned from the awkward exchange with Lauren Conrad, Lettermanning needs agreement by both parties to be effective.
“Whereas confirmatory thought involves a one-sided attempt to rationalize a particular point of view, exploratory thought involves even-handed consideration of alternative points of view.” In other words, confirmatory thought amplifies bias, promoting and encouraging motivated reasoning because its main purpose is justification. Confirmatory thought promotes a love and celebration of one’s own beliefs, distorting how the group processes information and works through decisions, the result of which can be groupthink.
Exploratory thought, on the other hand, encourages an open-minded and objective consideration of alternative hypotheses and a tolerance of dissent to combat bias. Exploratory thought helps the members of a group reason toward a more accurate representation of the world.
“Complex and open-minded thought is most likely to be activated when decision makers learn prior to forming any opinions that they will be accountable to an audience (a) whose views are unknown, (b) who is interested in accuracy, (c) who is reasonably well-informed, and (d) who has a legitimate reason for inquiring into the reasons behind participants’ judgments/choices.”
Their 2002 paper was one of several they coauthored supporting the conclusion that groups can improve the thinking of individual decision-makers when the individuals are accountable to a group whose interest is in accuracy.
We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world.
“I don’t want to hear it. I’m not trying to hurt your feelings, but if you have a question about a hand, you can ask me about strategy all day long. I just don’t think there’s much purpose in a poker story if the point is about something you had no control over, like bad luck.”
Accountability is a willingness or obligation to answer for our actions or beliefs to others. A bet is a form of accountability.
“Even research communities of highly intelligent and well-meaning individuals can fall prey to confirmation bias, as IQ is positively correlated with the number of reasons people find to support their own side in an argument.”
Experts engaging in traditional peer review, providing their opinion on whether an experimental result would replicate, were right 58% of the time. A betting market in which the traders were the exact same experts and those experts had money on the line predicted correctly 71% of the time.
If part of corporate success consists of providing the most accurate, objective, and detailed evaluation of what’s going on, employees will compete to win on those terms. That will reward better habits of mind.
Knowing how something turned out creates a conflict of interest that expresses itself as resulting.
If the group is blind to the outcome, it produces a higher-fidelity evaluation of decision quality. The best way to do this is to deconstruct decisions before an outcome is known.
After the outcome, make it a habit when seeking advice to give the details without revealing the outcome.
Skepticism is about approaching the world by asking why things might not be true rather than why they are true. It's a recognition that, while there is an objective truth, not everything we believe about the world is true.
Likewise, companies can implement an anonymous dissent channel, giving any employee, from the mail room to the boardroom, a venue to express dissenting opinions, alternative strategies, novel ideas, and points of view that may disagree with the prevailing viewpoint of the company without fear of repercussions.
The group needs rules of engagement that don’t make this harder by letting members get away with being nasty or dismissive.
There are several ways to communicate to maximize our ability to engage in a truthseeking way with anyone.
First, express uncertainty.
Second, lead with assent.
In addition, when the new information is presented as supplementing rather than negating what has come before, our listeners will be much more open to what we have to say.
Third, ask for a temporary agreement to engage in truthseeking.
Finally, focus on the future.
The future, on the other hand, can always be better if we can get them to focus on things in their control.
Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them.
When we make in-the-moment decisions (and don't ponder the past or future), we are more likely to be irrational and impulsive.