Kindle Notes & Highlights
by Annie Duke
Read June 25 – July 28, 2023
Pete Carroll was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.”
Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable.
Creating order out of chaos has been necessary for our survival.
Finding predictable connections is, literally, how our species survived.
Incorrectly interpreting rustling from the wind as an oncoming lion is called a type I error, a false positive. The consequences of such an error were much less grave than those of a type II error, a false negative. A false negative could have been fatal: hearing rustling and always assuming it’s the wind would have gotten our ancestors eaten, and we wouldn’t be here.
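The type I / type II distinction above matches the standard statistical terms, which a short sketch can make concrete. The function and the scenario labels are illustrative, not from the book:

```python
# Illustrative mapping of the error types described above.
# "prediction" = did we conclude a lion is there; "reality" = was one there.
def classify(prediction: bool, reality: bool) -> str:
    if prediction and not reality:
        return "type I error (false positive): fled from harmless wind"
    if not prediction and reality:
        return "type II error (false negative): dismissed a real lion"
    return "correct call"

print(classify(True, False))  # the survivable mistake
print(classify(False, True))  # the one that gets you eaten
```

The asymmetry in the passage is that the cost of the two error types differs, so a survival-tuned detector tolerates many false positives to avoid a single false negative.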
When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.
Game theory was succinctly defined by economist Roger Myerson (one of the game-theory Nobel laureates) as “the study of mathematical models of conflict and cooperation between intelligent rational decision-makers.”
Chess contains no hidden information and very little luck.
Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.
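The claim that a best-possible decision can still lose is expected value versus variance. A quick simulation makes it visible; the probability and payoffs are invented for illustration:

```python
import random

random.seed(0)  # reproducible illustration

p_win = 0.7            # hypothetical chance the "best possible" decision wins
win, lose = 1.0, -1.0  # hypothetical payoffs per hand

# Expected value per bet: 0.7*1 + 0.3*(-1) = 0.4 -- clearly a good decision.
ev = p_win * win + (1 - p_win) * lose

# Yet across 10,000 trials, roughly 3,000 individual outcomes are losses.
outcomes = [win if random.random() < p_win else lose for _ in range(10_000)]
avg = sum(outcomes) / len(outcomes)  # converges toward ev
losses = outcomes.count(lose)
```

Judging any single hand by its outcome ("resulting") conflates the 0.4 expected value of the decision with the 30% of runs where it loses anyway.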
“I don’t want to be the man who learns. I want to be the man who knows.”
We have to make peace with not knowing.
What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”
What good poker players and good decision-makers have in common is their comfort with the world being an uncertain and unpredictable place.
The secret is to make peace with walking around in a world where we recognize that we are not sure and that’s okay.
Making better decisions stops being about wrong or right and becomes about calibrating among all the shades of grey.
When the chances are known, we are tethered more tightly to a rational interpretation of the influence of luck.
Not placing a bet on something is, itself, a bet.
There is always opportunity cost in choosing one path over others.
In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.”
Whether it is a football game, a protest, or just about anything else, our pre-existing beliefs influence the way we experience the world.
This irrational, circular information-processing pattern is called motivated reasoning.
Author Eli Pariser developed the term “filter bubble” in his 2011 book of the same name to describe the process of how companies like Google and Facebook use algorithms to keep pushing us in the directions we’re already headed.
The surprise is that blind-spot bias is greater the smarter you are.
Acknowledging uncertainty is the first step in measuring and narrowing it.
Declaring our uncertainty in our beliefs to others makes us more credible communicators.
Not 100% of our bad outcomes are because we got unlucky, and not 100% of our good outcomes are because we are so awesome.
The expression “echo chamber” instantly conjures up the image of what results from our natural drift toward confirmatory thought.
We don’t win bets by being in love with our own ideas. We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world.
We have all experienced situations where we get two accounts of the same event, but the versions are dramatically different because they are informed by different facts and perspectives. This is known as the Rashomon Effect.
We treat outcomes as good signals for decision quality, as if we were playing chess.
In the performance art of improvisation, the first advice is that when someone starts a scene, you should respond with “yes, and . . .” “Yes” means you are accepting the construct of the situation. “And” means you are adding to it.
If someone is off-loading emotion to us, we can ask them if they are just looking to vent or if they are looking for advice.
In addition, we can make the best possible decisions and still not get the result we want. Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them.
This tendency we all have to favor our present-self at the expense of our future-self is called temporal discounting. We are willing to take an irrationally large discount to get a reward now instead of waiting for a bigger reward later.
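Temporal discounting is commonly modeled in behavioral economics with a hyperbolic discount function, V = A / (1 + k·t). The discount rate k and the dollar amounts below are invented for illustration, not taken from the book:

```python
# Hyperbolic discounting sketch: V = A / (1 + k * t).
# k (impatience) and the amounts are illustrative assumptions.
def present_value(amount: float, delay: float, k: float = 0.5) -> float:
    return amount / (1 + k * delay)

take_now = present_value(50, 0)   # $50 today feels worth exactly 50.0
wait = present_value(100, 4)      # $100 four periods out feels worth ~33.3
# The "irrationally large discount": $50 now subjectively beats $100 later,
# even though the later reward is objectively twice as big.
```

With k = 0.5, the felt value of the delayed $100 drops below the immediate $50, which is the preference reversal the highlight describes.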
We can take some space until we calm down and regain perspective, recognizing that when we are on tilt we aren’t decision fit.
This action—past-us preventing present-us from doing something stupid—has become known as a Ulysses contract.
When we identify the goal and work backward from there to “remember” how we got there, the research shows that we do better.
The most common form of working backward from our goal to map out the future is known as backcasting.

