Kindle Notes & Highlights
Nurses who felt some hesitations about assuming the mantle were actually more effective leaders, in part because they were more likely to seek out second opinions from colleagues. They saw themselves on a level playing field, and they knew that much of what they lacked in experience and expertise they could make up by listening.
Uncertainty primes us to ask questions and absorb new ideas. It protects us against the Dunning-Kruger effect.
“Maybe impostor syndrome is needed for change. Impostors rarely say, ‘This is how we do things around here.’ They don’t say, ‘This is the right way.’ I was so eager to learn and grow that I asked everyone for advice on how I could do things differently.”
A mark of lifelong learners is recognizing that they can learn something from everyone they meet.
In a classic paper, sociologist Murray Davis argued that when ideas survive, it’s not because they’re true—it’s because they’re interesting. What makes an idea interesting is that it challenges our weakly held opinions.
When an idea or assumption doesn’t matter deeply to us, we’re often excited to question it. The natural sequence of emotions is surprise (“Really?”) followed by curiosity (“Tell me more!”) and thrill (“Whoa!”).
When a core belief is questioned, though, we tend to shut down rather than open up.
“I change my mind at a speed that drives my collaborators crazy,” he explained. “My attachment to my ideas is provisional. There’s no unconditional love for them.”
Attachment. That’s what keeps us from recognizing when our opinions are off the mark and rethinking them. To unlock the joy of being wrong, we need to detach. I’ve learned that two kinds of detachment are especially useful: detaching your present from your past and detaching your opinions from your identity.
“If you don’t look back at yourself and think, ‘Wow, how stupid I was a year ago,’ then you must not have learned much in the last year.”
The second kind of detachment is separating your opinions from your identity.
Most of us are accustomed to defining ourselves in terms of our beliefs, ideas, and ideologies. This can become a problem when it prevents us from changing our minds as the world changes and knowledge evolves.
Who you are should be a question of what you value, not what you believe. Values are your core principles in life—they …
Based on his performance, Jean-Pierre might be the world’s best election forecaster. His advantage: he thinks like a scientist. He’s passionately dispassionate. At various points in his life, Jean-Pierre has changed his political ideologies and religious beliefs.
The single most important driver of forecasters’ success was how often they updated their beliefs. The best forecasters went through more rethinking cycles. They had the confident humility to doubt their judgments and the curiosity to discover new information that led them to revise their predictions.
“Accept the fact that you’re going to be wrong,” Jean-Pierre advises. “Try to disprove yourself. When you’re wrong, it’s not something to be depressed about. Say, ‘Hey, I discovered something!’”
I’ve noticed a paradox in great scientists and superforecasters: the reason they’re so comfortable being wrong is that they’re terrified of being wrong. What sets them apart is the time horizon. They’re determined to reach the correct answer in the long run, and they know that means they have to be open to stumbling, backtracking, and rerouting in the short run.
Andrew Lyne is not alone. Psychologists find that admitting we were wrong doesn’t make us look less competent. It’s a display of honesty and a willingness to learn. Although scientists believe it will damage their reputation to admit that their studies failed to replicate, the reverse is true: they’re judged more favorably if they acknowledge the new data rather than deny them.
When teams experience moderate task conflict early on, they generate more original ideas in Chinese technology companies, innovate more in Dutch delivery services, and make better decisions in American hospitals. As one research team concluded, “The absence of conflict is not harmony, it’s apathy.”
Task conflict can be constructive when it brings diversity of thought, preventing us from getting trapped in overconfidence cycles. It can help us stay humble, surface doubts, and make us curious about what we might be missing.
We learn more from people who challenge our thought process than those who affirm our conclusions. Strong leaders engage their critics and make themselves stronger.
Some organizations and occupations counter those tendencies by building challenge networks into their cultures.
Disagreeable givers often make the best critics: their intent is to elevate the work, not feed their own egos. They don’t criticize because they’re insecure; they challenge because they care. They dish out tough love.
In fact, when I argue with someone, it’s not a display of disrespect—it’s a sign of respect. It means I value their views enough to contest them. If their opinions didn’t matter to me, I wouldn’t bother. I know I have chemistry with someone when we find it delightful to prove each other wrong.
A major problem with task conflict is that it often spills over into relationship conflict.
Experiments show that simply framing a dispute as a debate rather than as a disagreement signals that you’re receptive to considering dissenting opinions and changing your mind, which in turn motivates the other person to share more information with you.
When social scientists asked people why they favored particular policies on taxes, health care, or nuclear sanctions, they often doubled down on their convictions. Asking people to explain how those policies would work in practice—or how they’d explain them to an expert—activated a rethinking cycle. They noticed gaps in their knowledge, doubted their conclusions, and became less extreme; they were now more curious about alternative options.
Psychologists find that many of us are vulnerable to an illusion of explanatory depth. Take everyday objects like a bicycle, a piano, or earbuds: how well do you understand them? People tend to be overconfident in their knowledge: they believe they know much more than they actually do about how these objects work. We can help them see the limits of their understanding by asking them to unpack the mechanisms. How do the gears on a bike work? How does a piano key make music? How do earbuds transmit sound from your phone to your ears? People are surprised by how much they struggle to answer those questions.
As they shifted into scientist mode, they focused less on why different solutions would succeed or fail, and more on how those solutions might work.
Finally, she hit back: “You’re a logic bully!” A what? “A logic bully,” Jamie repeated. “You just overwhelmed me with rational arguments, and I don’t agree with them, but I can’t fight back.”
The skilled negotiators rarely went on offense or defense. Instead, they expressed curiosity with questions like “So you don’t see any merit in this proposal at all?”
Of every five comments the experts made, at least one ended in a question mark. They appeared less assertive, but much like in a dance, they led by letting their partners step forward.
“Arguments are often far more combative and adversarial than they need to be,” Harish told me. “You should be willing to listen to what someone else is saying and give them a lot of credit for it. It makes you sound like a reasonable person who is taking everything into account.”
Most people immediately start with a straw man, poking holes in the weakest version of the other side’s case. He does the reverse: he considers the strongest version of their case, which is known as the steel man. A politician might occasionally adopt that tactic to pander or persuade, but like a good scientist, Harish does it to learn. Instead of trying to dismantle the argument that preschool is good for kids, Harish accepted that the point was valid, which allowed him to relate to his opponent’s perspective—and to the audience’s. Then it was perfectly fair and balanced for him to express …
“If you have too many arguments, you’ll dilute the power of each and every one,”
It’s when audiences are skeptical of our view, have a stake in the issue, and tend to be stubborn that piling on justifications is most likely to backfire. If they’re resistant to rethinking, more reasons simply give them more ammunition to shoot our views down.
The two messages were equally effective: in both cases, 6.5 percent of the stingy alumni ended up donating. Then we combined them, because two reasons are better than one. Except they weren’t. When we put the two reasons together, the giving rate dropped below 3 percent. Each reason alone was more than twice as effective as the two combined. The audience was already skeptical. When we gave them different kinds of reasons to donate, we triggered their awareness that someone was trying to persuade them—and they shielded themselves against it. A single line of argument feels like a conversation; multiple lines of argument can become an onslaught.
What did move the needle was an email with a different approach. We simply asked fans one question: are you planning to attend? Attendance climbed to 85 percent. The question gave fans the freedom to make their own case for going.
We don’t have to convince them that we’re right—we just need to open their minds to the possibility that they might be wrong. Their natural curiosity might do the rest.
Instead of attacking their beliefs with my research, I’d ask them what would open their minds to my data.
Having a conversation about the conversation shifts attention away from the substance of the disagreement and toward the process for having a dialogue.
In a heated argument, you can always stop and ask, “What evidence would change your mind?” If the answer is “nothing,” then there’s no point in continuing the debate.
If we hold an opinion weakly, expressing it strongly can backfire. Communicating it with some uncertainty signals confident humility, invites curiosity, and leads to a more nuanced discussion. Research shows that in courtrooms, expert witnesses and deliberating jurors are more credible and more persuasive when they express moderate confidence, rather than high or low confidence.
After establishing the drawbacks of her case, she emphasized a few reasons to hire her anyway: “But what I do have are skills that can’t be taught. I take ownership of projects far beyond my pay grade and what is in my defined scope of responsibilities. I don’t wait for people to tell me what to do and go seek for myself what needs to be done.”
Psychologist George Kelly observed that our beliefs are like pairs of reality goggles. We use them to make sense of the world and navigate our surroundings. A threat to our opinions cracks our goggles, leaving our vision blurred. It’s only natural to put up our guard in response—and Kelly noticed that we become especially hostile when trying to defend opinions that we know, deep down, are false.
When we meet group members who defy a stereotype, our first instinct isn’t to see them as exemplars and rethink the stereotype. It’s to see them as exceptions and cling to our existing beliefs.
“You’re actually rooting for the clothes,” Jerry Seinfeld quipped. “Fans will be so in love with a player, but if he goes to a different team, they boo him. This is the same human being in a different shirt; they hate him now. Boo! Different shirt! Boo!”
Regardless of whether they generated reasons to like their rivals, fans showed less hostility when they reflected on how silly the rivalry was.
Even if people aren’t on guard from the start, they’re quick to put their defenses up when their attitudes are challenged. Getting through to them requires more than just telling them that their views are arbitrary. A key step is getting them to do some counterfactual thinking: helping them consider what they’d believe if they were living in an alternative reality.

