Asking people questions can motivate them to rethink their conclusions.
This is a common problem in persuasion: what doesn’t sway us can make our beliefs stronger. Much like a vaccine inoculates our physical immune system against a virus, the act of resistance fortifies our psychological immune system.
Refuting a point of view produces antibodies against future influence attempts. We become more certain of our opinions and less curious about alternative views. Counterarguments no longer surprise us or stump us—we have our rebuttals ready.
Instead of attacking or demeaning his clients, Miller started asking them questions and listening to their answers.
Together, they developed the core principles of a practice called motivational interviewing. The central premise is that we can rarely motivate someone else to change. We’re better off helping them find their own motivation to change.
Motivational interviewing starts with an attitude of humility and curiosity. We don’t know what might motivate someone else to change, but we’re genuinely eager to find out. The goal isn’t to tell people what to do; it’s to help them break out of overconfidence cycles and see new possibilities.
The process of motivational interviewing involves three key techniques: Asking open-ended questions Engaging in reflective listening Affirming the person’s desire and ability to change
A key turning point, she recalls, was when Arnaud “told me that whether I chose to vaccinate or not, he respected my decision as someone who wanted the best for my kids. Just that sentence—to me, it was worth all the gold in the world.”
Overall, motivational interviewing has a statistically and clinically meaningful effect on behavior change in roughly three out of four studies, and psychologists and physicians using it have a success rate of four in five. There aren’t many practical theories in the behavioral sciences with a body of evidence this robust.
To protect their freedom, instead of giving commands or offering recommendations, a motivational interviewer might say something along the lines of “Here are a few things that have helped me—do you think any of them might work for you?”
When we try to convince people to think again, our first instinct is usually to start talking. Yet the most effective way to help others open their minds is often to listen.
There’s a fourth technique of motivational interviewing, which is often recommended for the end of a conversation and for transition points: summarizing.
I was so excited that Jeff had decided to share his vision that I didn’t ask any questions about what it was—or how he would present it. I had worked with him to rethink whether and when to give a speech, but not what was in it. If I could go back, I’d ask Jeff how he was considering conveying his message and how he thought his team would receive it.
Psychologists have found that when people detect an attempt at influence, they have sophisticated defense mechanisms. The moment people feel that we’re trying to persuade them, our behavior takes on a different meaning. A straightforward question is seen as a political tactic, a reflective listening statement comes across as a prosecutor’s maneuvering, an affirmation of their ability to change sounds like a preacher’s proselytizing.
It’s common for doctors to interrupt their patients within 11 seconds, even though patients may need only 29 seconds to describe their symptoms.
“I started a dialogue,” he told me. “The aim was to build a trusting relationship. If you present information without permission, no one will listen to you.”
We now know that where complicated issues are concerned, seeing the opinions of the other side is not enough. Social media platforms have exposed us to them, but they haven’t changed our minds.
Hearing an opposing opinion doesn’t necessarily motivate you to rethink your own stance; it makes it easier for you to stick to your guns (or your gun bans). Presenting two extremes isn’t the solution; it’s part of the polarization problem. Psychologists have a name for this: binary bias.
According to Robert Benchley, there are two kinds of people: those who divide the world into two kinds of people, and those who don’t.
People are actually more inclined to think again if we present these topics through the many lenses of a prism.
If they read the complexified version, they made about twice as many comments about common ground as about their own views.
A fundamental lesson of desirability bias is that our beliefs are shaped by our motivations. What we believe depends on what we want to believe.
In a 2010 op-ed, he contrasted scientists with “climate deniers.” This is binary bias in action. It presumes that the world is divided into two sides: believers and nonbelievers.
To overcome binary bias, a good starting point is to become aware of the range of perspectives across a given spectrum.
And multiple experiments have shown that when experts express doubt, they become more persuasive. When someone knowledgeable admits uncertainty, it surprises people, and they end up paying more attention to the substance of the argument.
Psychologists find that people will ignore or even deny the existence of a problem if they’re not fond of the solution.
Asking “how” tends to reduce polarization, setting the stage for more constructive conversations about action.
Acknowledging complexity doesn’t make speakers and writers less convincing; it makes them more credible. It doesn’t lose viewers and readers; it maintains their engagement while stoking their curiosity.
I think they’re both missing the point. Instead of arguing about whether emotional intelligence is meaningful, we should be focusing on the contingencies that explain when it’s more and less consequential.
It can help to make that respect explicit at the start of a conversation. In one experiment, if an ideological opponent merely began by acknowledging that “I have a lot of respect for people like you who stand by their principles,” people were less likely to see her as an adversary—and showed her more generosity.
Evidence shows that if false scientific beliefs aren’t addressed in elementary school, they become harder to change later.
“Learning counterintuitive scientific ideas [is] akin to becoming a fluent speaker of a second language,” psychologist Deborah Kelemen writes. It’s “a task that becomes increasingly difficult the longer it is delayed, and one that is almost never achieved with only piecemeal instruction and infrequent practice.”
Despite enjoying the lectures more, they actually gained more knowledge and skill from the active-learning session. It required more mental effort, which made it less fun but led to deeper understanding.
In the above meta-analysis, lecturing was especially ineffective in debunking known misconceptions—in leading students to think again.
And experiments have shown that when a speaker delivers an inspiring message, the audience scrutinizes the material less carefully and forgets more of the content—even while claiming to remember more of it.
Nozick created his own version of an experience machine: he insisted on teaching a new class every year. “I do my thinking through the courses I give,” he said.
“Presenting a completely polished and worked-out view doesn’t give students a feel for what it’s like to do original work in philosophy and to see it happen, to catch on to doing it,” he explained.
What I found so inspiring about Nozick’s approach was that he wasn’t content for students to learn from him. He wanted them to learn with him. Every time he tackled a new topic, he would have the opportunity to rethink his existing views on it.
It dawned on me that I could create a new assignment to teach rethinking. I assigned students to work in small groups to record their own mini-podcasts or mini–TED talks.
When students confront complex problems, they often feel confused. A teacher’s natural impulse is to rescue them as quickly as possible so they don’t feel lost or incompetent. Yet psychologists find that one of the hallmarks of an open mind is responding to confusion with curiosity and interest. One student put it eloquently: “I need time for my confusion.” Confusion can be a cue that there’s new territory to be explored or a fresh puzzle to be solved.
“Quality means rethinking, reworking, and polishing,” Ron reflects. “They need to feel they will be celebrated, not ridiculed, for going back to the drawing board.”
Over the past few years, psychological safety has become a buzzword in many workplaces. Although leaders might understand its significance, they often misunderstand exactly what it is and how to create it.
It’s fostering a climate of respect, trust, and openness in which people can raise concerns and suggestions without fear of reprisal. It’s the foundation of a learning culture.
“All anybody would’ve had to ask is, ‘How do you know the drink bag leaked?’ The answer would’ve been, ‘Because somebody told us.’ That response would’ve set off red flags. It would’ve taken ten minutes to check, but nobody asked.”
How do you know? It’s a question we need to ask more often, both of ourselves and of others. The power lies in its frankness. It’s nonjudgmental—a straightforward expression of doubt and curiosity that doesn’t put people on the defensive.
By admitting some of their imperfections out loud, managers demonstrated that they could take it—and made a public commitment to remain open to feedback. They normalized vulnerability, making their teams more comfortable opening up about their own struggles. Their employees gave more useful feedback because they knew where their managers were working to grow. That motivated managers to create practices to keep the door open:
Organizational learning should be an ongoing activity, but best practices imply it has reached an endpoint. We might be better off looking for better practices.
Focusing on results might be good for short-term performance, but it can be an obstacle to long-term learning.
Exclusively praising and rewarding results is dangerous because it breeds overconfidence in poor strategies, incentivizing people to keep doing things the way they’ve always done them. It isn’t until a high-stakes decision goes horribly wrong that people pause to reexamine their practices.