Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts
Kindle Notes & Highlights

21%
the order in which we form abstract beliefs: We hear something; We believe it; Only sometimes, later, if we have the time or the inclination, we think about it and vet it, determining whether or not it is true. “Wanna bet?” triggers us to engage in that third step that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs.
21%
The more objective we are, the more accurate our beliefs become. And the person who wins bets over the long run is the one with the more accurate beliefs.
22%
Given that even scientific facts can have an expiration date, we would all be well-advised to take a good hard look at our beliefs, which are formed and updated in a much more haphazard way than those in science.
22%
When we express our beliefs (to others or just to ourselves as part of our internal decision-making dialogue), they don’t generally come with qualifications. What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten?
22%
Forcing ourselves to express how sure we are of our beliefs brings to plain sight the probabilistic nature of those beliefs, that what we believe is almost never 100% or 0% accurate but, rather, somewhere in between.
22%
We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from “right” to “wrong.” When confronted with new evidence, it is a very different narrative to say, “I was 58% but now I’m 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.”
23%
When we work toward belief calibration, we become less judgmental of ourselves.
23%
The fact that the person is expressing their confidence as less than 100% signals that they are trying to get at the truth, that they have considered the quantity and quality of their information with thoughtfulness and self-awareness. And thoughtful and self-aware people are more believable.
23%
Expressing our level of confidence also invites people to be our collaborators.
23%
when we declare something as 100% fact, others might be reluctant to offer up new and relevant information that would inform our beliefs for two reasons. First, they might be afraid they are wrong and so won’t speak up, worried they will be judged for that, by us or themselves. Second, even if they are very confident their information is high quality, they might be afraid of making us feel bad or judged.
23%
Admitting we are not sure is an invitation for help in refining our beliefs, and that will make our beliefs much more accurate over time as we are more likely to gather relevant information. Expressing our beliefs this way also serves our listeners.
23%
Acknowledging that decisions are bets based on our beliefs, getting comfortable with uncertainty, and redefining right and wrong are integral to a good overall approach to decision-making.
24%
while experience is necessary to becoming an expert, it’s not sufficient.
24%
As novelist and philosopher Aldous Huxley recognized, “Experience is not what happens to a man; it is what a man does with what happens to him.”
25%
As outcomes come our way, figuring out whether those outcomes were caused mainly by luck or whether they were the predictable result of particular decisions we made is a bet of great consequence. If we determine our decisions drove the outcome, we can feed the data we get following those decisions back into belief formation and updating, creating a learning loop.
26%
The way our lives turn out is the result of two things: the influence of skill and the influence of luck.
27%
…from outcomes a pretty haphazard process. A negative outcome could be a signal to go in and examine our decision-making. That outcome could also be due to bad luck, unrelated to our decision, in which case treating that outcome as a signal to change future decisions would be a mistake.
27%
When we figure out why something happened, we look for a plausible reason, but one that also fits our wishes.
28%
100% of our bad outcomes aren’t because we got unlucky and 100% of our good outcomes aren’t because we are so awesome. Yet that is how we process the future as it unfolds.
30%
“We must believe in luck. For how else can we explain the success of those we don’t like?”
31%
We all want to feel good about ourselves in the moment, even if it’s at the expense of our long-term goals.
31%
Ideally, our happiness would depend on how things turn out for us regardless of how things turn out for anyone else. Yet, on a fundamental level, fielding someone’s bad outcome as their fault feels good to us. On a fundamental level, fielding someone’s good outcome as luck helps our narrative along.
31%
Engaging the world through the lens of competition is deeply embedded in our animal brains. It’s not enough to boost our self-image solely by our own successes. If someone we view as a peer is winning, we feel like we’re losing by comparison. We benchmark ourselves to them.
32%
What accounts for most of the variance in happiness is how we’re doing comparatively.
32%
Habits operate in a neurological loop consisting of three parts: the cue, the routine, and the reward. A habit could involve eating cookies: the cue might be hunger, the routine going to the pantry and grabbing a cookie, and the reward a sugar high.
32%
the best way to deal with a habit is to respect the habit loop: “To change a habit, you must keep the old cue, and deliver the old reward, but insert a new routine.”
33%
Our brain is built to seek positive self-image updates. It is also built to view ourselves in competition with our peers.
33%
Keep the reward of feeling like we are doing well compared to our peers, but change the features by which we compare ourselves: be a better credit-giver than your peers, more willing than others to admit mistakes, more willing to explore possible reasons for an outcome with an open mind, even, and especially, if that might cast you in a bad light or shine a good light on someone else. In this way we can feel that we are doing well by comparison because we are doing something unusual and hard that most people don’t do. That makes us feel exceptional.
34%
The key is that in explicitly recognizing that the way we field an outcome is a bet, we consider a greater number of alternative causes more seriously than we otherwise would have. That is truthseeking.
34%
Once we start actively training ourselves in testing alternative hypotheses and perspective taking, it becomes clear that outcomes are rarely 100% luck or 100% skill. This means that when new information comes in, we have options beyond unquestioned confirmation or reversal.
35%
The benefits of recognizing just a few extra learning opportunities compound over time. The cumulative effect of being a little better at decision-making, like compounding interest, can have huge effects in the long run on everything that we do.
36%
Such interactions are reminders that not all situations are appropriate for truthseeking, nor are all people interested in the pursuit.
38%
as long as there are three people in the group (two to disagree and one to referee), the truthseeking group can be stable and productive.
38%
while a group can function to be better than the sum of the individuals, it doesn’t automatically turn out that way. Being in a group can improve our decision quality by exploring alternatives and recognizing where our thinking might be biased, but a group can also exacerbate our tendency to confirm what we already believe.
38%
“Whereas confirmatory thought involves a one-sided attempt to rationalize a particular point of view, exploratory thought involves even-handed consideration of alternative points of view.”
38%
confirmatory thought amplifies bias, promoting and encouraging motivated reasoning because its main purpose is justification. Confirmatory thought promotes a love and celebration of one’s own beliefs, distorting how the group processes information and works through decisions, the result of which can be groupthink. Exploratory thought, on the other hand, encourages an open-minded and objective consideration of alternative hypotheses and a tolerance of dissent to combat bias. Exploratory thought helps the members of a group reason toward a more accurate representation of the world.
39%
“If you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system. This is why it’s so important to have intellectual and ideological diversity within any group or institution whose goal is to find truth.”
39%
the advice of these experts in group interaction adds up to a pretty good blueprint for a truthseeking charter: A focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group; Accountability, for which members have advance notice; and Openness to a diversity of ideas.
39%
“I’m not trying to hurt your feelings, but if you have a question about a hand, you can ask me about strategy all day long. I just don’t think there’s much purpose in a poker story if the point is about something you had no control over, like bad luck.”
40%
When I started playing poker, “discussing hands” consisted mostly of my complaining about bad luck when I lost. My brother quickly got sick of my moaning. He laid down the law and said I was only allowed to ask him about hands that I had won. If I wanted him to engage with me, I had to identify some point in those hands where I might have made a mistake.
40%
Identifying mistakes in hands I won reinforced the separation between outcomes and decision quality.
40%
Once we are in a group that regularly reinforces exploratory thought, the routine becomes reflexive, running on its own. Exploratory thought becomes a new habit of mind, the new routine, and one that is self-reinforced.
40%
Accountability is a willingness or obligation to answer for our actions or beliefs to others. A bet is a form of accountability.
41%
in the moment of losing, I might not be my most rational self in assessing whether I was losing because I was getting unlucky or losing because I was playing poorly. A predetermined loss limit acts as a check against irrationally chasing losses, but self-enforcement is a problem.
41%
after leaving a losing game and going home, I could offset some of the sting of losing by running the conversation where my pod would approve of my decision to quit the game when I told them about it.
41%
When we think in bets, we run through a series of questions to examine the accuracy of our beliefs. For example: Why might my belief not be true? What other evidence might be out there bearing on my belief? Are there similar areas I can look toward to gauge whether similar beliefs to mine are true? What sources of information could I have missed or minimized on the way to reaching my belief? What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me? What other perspectives are there as to why things turned out the way they…
41%
there is only so much we can do to answer these questions on our own. We only get exposed to the information we have been exposed to, only live the experiences we have experienced, only think of the hypotheses that we can conceive of. It’s hard to know what reasons someone else could have for believing something different. We aren’t them. We haven’t had their experiences. We don’t know what different information they have. But they do.
42%
A diverse group can do some of the heavy lifting of de-biasing for us.
42%
After September 11, the CIA created “red teams” that, according to Georgetown law professor Neal Katyal in a New York Times op-ed, “are dedicated to arguing against the intelligence community’s conventional wisdom and spotting flaws in logic and analysis.”
42%
Dissent channels and red teams are a beautiful implementation of Mill’s bedrock principle that we can’t know the truth of a matter without hearing the other side.