Kindle Notes & Highlights
When we want something to be true, he said, we ask ourselves, “Can I believe this?” searching for an excuse to accept it. When we don’t want something to be true, we instead ask ourselves, “Must I believe this?” searching for an excuse to reject it.
Imagine you discover a road that has a fence built across it for no particular reason you can see. You say to yourself, “Why would someone build a fence here? This seems unnecessary and stupid; let’s tear it down.” But if you don’t understand why the fence is there, Chesterton argued, you can’t be confident that it’s okay to tear it down.
Soldier mindset helps us avoid negative emotions like fear, stress, and regret.
Like Tracy, we often use soldier mindset to protect our egos by finding flattering narratives for unflattering facts.
Another mental move is to selectively focus on the features of a situation that justify optimism, while ignoring those that justify pessimism.
Note that the goal here isn’t to get other people to share your beliefs, the way it is in the case of “Persuasion.” The nihilist isn’t trying to get other people to believe in nihilism. He’s trying to get them to believe that he believes in nihilism. Just as there are fashions in clothing, so, too, are there fashions in ideas.
We use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us—our ability to feel good about our lives and ourselves, our motivation to try hard things and stick with them, our ability to look good and persuade, and our acceptance in our communities.
Being rationally irrational, therefore, would mean that we’re good at unconsciously choosing just enough epistemic irrationality to achieve our social and emotional goals, without impairing our judgment too much.
It’s widely known that present bias shapes our choices about how to act. What’s much less appreciated is that it also shapes our choices about how to think. Just as with sleeping in, breaking your diet, or procrastinating on your work, we reap the rewards of thinking in soldier mindset right away, while the costs don’t come due until later.
Intelligence and knowledge are just tools. You can use those tools to help you see the world clearly, if that’s what you’re motivated to do. Or you can use them to defend a particular viewpoint, if you’re motivated to do that instead.
In the rest of this chapter, we’ll explore five signs of scout mindset: behavioral cues that someone cares about truth and will seek it out even when they’re not forced to, and even when the truth isn’t favorable to them.
Over the next few pages, we’ll explore five different types of thought experiments: the double standard test, the outsider test, the conformity test, the selective skeptic test, and the status quo bias test.
If you believe “I was being reasonable in that fight with my partner, and he was being unreasonable,” a hypothetical test might go something like this: Suppose an objective third party is given all of the relevant details about the fight and is asked to judge which of you two was being more reasonable. If he judges in your favor, you win $1,000; if not, you lose $1,000. How confident do you feel that you would win that bet?
I can bet on self-driving cars, and get $10,000 if they’re on the market in a year. Alternatively, I can take the “ball bet”: I’m given a box containing four balls, one of which is gray. I reach in and pull out one ball, without looking—if it’s gray, I win $10,000.* Since drawing the gray ball is a one-in-four chance, preferring the ball bet means my real confidence that self-driving cars will be on the market in a year is below 25 percent.
The superforecasters changed their minds all the time. Not dramatic, 180-degree reversals every day, but subtle revisions as they learned new information.
In it, Graham pointed to the problem I described in the previous chapter and warned, “The more labels you have for yourself, the dumber they make you.”1 Inspired in part by Graham’s essay, I resolved to avoid identifying myself with any ideology, movement, or group.
What you need to be able to do is keep those identities from colonizing your thoughts and values. I call this “holding your identity lightly.”
The ideological Turing test, suggested by economist Bryan Caplan, is based on similar logic.11 It’s a way to determine if you really understand an ideology: Can you explain it as a believer would, convincingly enough that other people couldn’t tell the difference between you and a genuine believer?
Before you close this book, consider making a plan for what those incremental steps toward scout mindset might look like for you. I recommend picking a small number of scout habits to start with, no more than two or three. Here’s a list of ideas to choose from: