Kindle Notes & Highlights
Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.
Poker players call the habit of equating the quality of a decision with the quality of its outcome “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.
Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.
It sounded like a bad result, not a bad decision.
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.
No sober person thinks getting home safely after driving drunk reflects a good decision or good driving ability. Changing future decisions based on that lucky result is dangerous and unheard of (unless you are reasoning this out while drunk and obviously deluding yourself).
Quick or dead: our brains weren’t built for rationality
Seeking certainty helped keep us alive all this time, but it can wreak havoc on our decisions in an uncertain world.
When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer.
John von Neumann
But life is more like poker. You could make the smartest, most careful decision in firing a company president and still have it blow up in your face.
You could run a red light and get through the intersection safely—or follow all the traffic rules and signals and end up in an accident. You could teach someone the rules of poker in five minutes, put them at a table with a world champion player, deal a hand (or several), and the novice could beat the champion. That could never happen in chess.
We never considered that both goblets might be poisoned. (“Inconceivable” would have been Vizzini’s term, had he been able to comment on his own death.)
When someone asks you about a coin they flipped four times, there is a correct answer: “I’m not sure.”
“I’m not sure”: using uncertainty to our advantage
Just as we have problems with resulting and hindsight bias when we evaluate decisions solely on how they turn out, we have a mirror-image problem in making prospective decisions.
We are discouraged from saying “I don’t know” or “I’m not sure.” We regard those expressions as vague, unhelpful, and even evasive. But getting comfortable with “I’m not sure” is a vital step to being a better decision-maker. We have to make peace with not knowing.
None of that experience, however, makes it possible for a poker player to know how any given hand will turn out.
The veteran will just have a better guess.
It is often the case that our best choice doesn’t even have a particularly high likelihood of succeeding.
There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers. Here are two of them. First, “I’m not sure” is simply a more accurate representation of the world. Second, and related, when we accept that we can’t be sure, we are less likely to fall into the trap of black-and-white thinking.
If we misrepresent the world at the extremes of right and wrong, with no shades of grey in between, our ability to make good choices—choices about how we are supposed to be allocating our resources, what kind of decisions we are supposed to be making, and what kind of actions we are supposed to be taking—will suffer.
The public-at-large is often guilty of making black-and-white judgments about the “success” or “failure” of probabilistic thinking.
When the UK voted to leave the European Union (“Brexit”) in June 2016, it was an unlikely result.
Just like my spectator, Dershowitz missed the point. Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold.
When the 24% result happened at the final table of the charity tournament, that didn’t reflect inaccuracy about the probabilities as determined before that single outcome.
Decisions are bets on the future, and they aren’t “right” or “wrong” based on whether they turn out well on any particular iteration. An unwanted result doesn’t make our decision wrong if we thought about the alternatives and probabilities in advance and allocated our resources accordingly, as my client the CEO and Pete Carroll both did.
Maybe we made the best decision from a set of unappealing choices, none of which were likely to turn out well.
Maybe we committed our resources on a long shot because the payout more than compensated for the risk, but the long shot didn’t come in this time.
Maybe we made the best choice based on the available information, but decisive information was hidden and we...
Maybe we chose a path with a very high likelihood of succe...
Maybe there were other choices that might have been better and the one we made wasn’t wrong or ri...
The second-best choice is...
For most of our decisions, there will be a lot of space between unequivocal “right” and “wrong.”
Redefining wrong allows us to let go of all the anguish that comes from getting a bad result. But it also means we must redefine “right.” If we aren’t wrong just because things didn’t work out, then we aren’t right just because things turned out well. What do we win emotionally by making that mindset trade-off?
The world is structured to give us lots of opportunities to feel bad about being wrong if we want to measure ourselves by outcomes. Don’t fall for it!
Second, being wrong hurts us more than being right feels good. We know from Daniel Kahneman and Amos Tversky’s work on loss aversion, part of prospect theory (which won Kahneman the Nobel Prize in Economics in 2002), that losses in general feel about two times as bad as wins feel good. So winning $100 at blackjack feels as good to us as losing $50 feels bad to us. Because being right feels like winning and being wrong feels like losing, that means we need two favorable results for every one unfavorable result just to break even emotionally.
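The two-to-one claim can be checked with simple arithmetic. Here is a minimal sketch, assuming a loss-aversion weight of exactly 2; the coefficient, function name, and dollar amounts are illustrative assumptions, not figures from the book beyond the rough 2:1 ratio the passage cites.

```python
# Toy model of loss aversion: a loss is felt about twice as strongly
# as an equal-sized gain (assumed coefficient of 2.0).
LOSS_AVERSION = 2.0

def felt_value(outcomes):
    """Sum the emotional impact of a list of outcomes: wins count at face value,
    losses are weighted by the loss-aversion coefficient."""
    return sum(x if x >= 0 else LOSS_AVERSION * x for x in outcomes)

print(felt_value([100]))             # winning $100 -> +100 units of "feeling good"
print(felt_value([-50]))             # losing $50   -> -100 units, the same magnitude
print(felt_value([100, 100, -100]))  # two wins per equal-sized loss -> 0, the emotional break-even
```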
There is a lot to consider in addition to money; we might be willing to make less money to move to a place we imagine we would like a lot better. Will the new job have better opportunities for advancement and future gains, independent of short-term gains in compensation? What are the differences in pay, benefits, security, work environment, and the kind of work we’d be doing? What are we giving up by leaving our city, colleagues, and friends for a new place? We have to inventory the potential upside and downside of taking the bet just like Hennigan did.
In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
Flaws in forming and updating beliefs have the potential to snowball. Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning. The way we process new information is driven by the beliefs we hold, strengthening them. Those strengthened beliefs then drive how we process further information, and so on.
The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view.
Chalk up an outcome to skill, and we take credit for the result. Chalk up an outcome to luck, and it wasn’t in our control.
“Self-serving bias”
Black-and-white thinking, uncolored by the reality of uncertainty, is a driver of both motivated reasoning and self-serving bias.
Unfortunately, learning from watching others is just as fraught with bias.
Just as with self-serving bias, blaming others for their bad results and failing to give them credit for their good ones is under the influence of ego.
Engaging the world through the lens of competition is deeply embedded in our animal brains.
If someone we view as a peer is winning, we feel like we’re losing by comparison.