Kindle Notes & Highlights
Read between December 4 - December 12, 2022
esprit de corps.
counterfactual thinking:
In psychology, counterfactual thinking involves imagining how the circumstances of our lives could have unfolded differently. When we realize how easily we could have held different stereotypes, we might be more willing to update our views.[*] To activate counterfactual thinking, you might ask people questions like: How would your stereotypes be different if you’d been born Black, Hispanic, Asian, or Native American?
ornery.
It’s a rare person who wants to hear what he doesn’t want to hear. —Attributed to Dick Cavett
The process of motivational interviewing involves three key techniques:
- Asking open-ended questions
- Engaging in reflective listening
- Affirming the person's desire and ability to change
That was a turning point. In motivational interviewing, there’s a distinction between sustain talk and change talk. Sustain talk is commentary about maintaining the status quo. Change talk is referencing a desire, ability, need, or commitment to make adjustments. When contemplating a change, many people are ambivalent—they have some reasons to consider it but also some reasons to stay the course. Miller and Rollnick suggest asking about and listening for change talk, and then posing some questions about why and how they might change.
Listening well is more than a matter of talking less. It’s a set of skills in asking and responding. It starts with showing more interest in other people’s interests rather than trying to judge their status or prove our own. We can all get better at asking “truly curious questions that don’t have the hidden agenda of fixing, saving, advising, convincing or correcting,” journalist Kate Murphy writes, and helping to “facilitate the clear expression of another person’s thoughts.”[*]
We’re all vulnerable to the “righting reflex,” as Miller and Rollnick describe it—the desire to fix problems and offer answers. A skilled motivational interviewer resists the righting reflex—although people want a doctor to fix their broken bones, when it comes to the problems in their heads, they often want sympathy rather than solutions.
Many communicators try to make themselves look smart. Great listeners are more interested in making their audiences feel smart. They help people approach their own views with more humility, doubt, and curiosity. When people have a chance to express themselves out loud, they often discover new thoughts. As the writer E. M. Forster put it, “How can I tell what I think till I see what I say?” That understanding made Forster an unusually dedicated listener. In the words of one biographer, “To speak with him was to be seduced by an inverse charisma, a sense of being listened to with such intensity . . .”
Among managers rated as the worst listeners by their employees, 94 percent of them evaluated themselves as good or very good listeners.
Listening is a way of offering others our scarcest, most precious gift: our attention.
When conflict is cliché, complexity is breaking news. —Amanda Ripley
Psychologists have a name for this: binary bias. It’s a basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories. To paraphrase the humorist Robert Benchley, there are two kinds of people: those who divide the world into two kinds of people, and those who don’t.
An antidote to this proclivity is complexifying: showcasing the range of perspectives on a given topic. We might believe we’re making progress by discussing hot-button issues as two sides of a coin, but people are actually more inclined to think again if we present these topics through the many lenses of a prism. To borrow a phrase from Walt Whitman, it takes a multitude of views to help people realize that they too contain multitudes.
A dose of complexity can disrupt overconfidence cycles and spur rethinking cycles. It gives us more humility about our knowledge and more doubts about our opinions, and it can make us curious enough to discover information we were lacking. In Peter’s experiment, all it took was framing gun control not as an issue with only two extreme positions but rather as one involving many interrelated dilemmas. As journalist Amanda Ripley describes it, the gun control article “read less like a lawyer’s opening statement and more like an anthropologist’s field notes.”
For a long time, I struggled with how to handle politics in this book. I don’t have any silver bullets or simple bridges across a widening gulf. I don’t really even believe in political parties. As an organizational psychologist, I want to vet candidates’ leadership skills before I worry about their policy positions. As a citizen, I believe it’s my responsibility to form an independent opinion on each issue. Eventually, I decided that the best way to stay above the fray was to explore the moments that affect us all as individuals: the charged conversations we have in person and online.
A fundamental lesson of desirability bias is that our beliefs are shaped by our motivations. What we believe depends on what we want to believe.
This is binary bias in action. It presumes that the world is divided into two sides: believers and nonbelievers.
As the Committee for Skeptical Inquiry put it in a plea to the media, skepticism is “foundational to the scientific method,” whereas denial is “the a priori rejection of ideas without objective consideration.”[*]
A fascinating example is the divide around emotional intelligence. On one extreme is Daniel Goleman, who popularized the concept. He preaches that emotional intelligence matters more for performance than cognitive ability (IQ) and accounts for “nearly 90 percent” of success in leadership jobs. At the other extreme is Jordan Peterson, writing that “There is NO SUCH THING AS EQ” and prosecuting emotional intelligence as “a fraudulent concept, a fad, a convenient band-wagon, a corporate marketing scheme.”
In the moral philosophy of John Rawls, the veil of ignorance asks us to judge the justice of a society by whether we’d join it without knowing our place in it. I think the scientist’s veil of ignorance is to ask whether we’d accept the results of a study based on the methods involved, without knowing what the conclusion will be.
In a productive conversation, people treat their feelings as a rough draft. Like art, emotions are works in progress. It rarely serves us well to frame our first sketch. As we gain perspective, we revise what we feel. Sometimes we even start over from scratch.
racism is a function of our actions, not merely our intentions.
No schooling was allowed to interfere with my education. —Grant Allen
there’s a huge difference between learning about other people’s false beliefs and actually learning to unbelieve things ourselves.
This is part of a broader movement to teach kids to think like fact-checkers: the guidelines include (1) “interrogate information instead of simply consuming it,” (2) “reject rank and popularity as a proxy for reliability,” and (3) “understand that the sender of information is often not its source.”
It turns out that although perfectionists are more likely than their peers to ace school, they don’t perform any better than their colleagues at work. This tracks with evidence that, across a wide range of industries, grades are not a strong predictor of job performance.
“Valedictorians aren’t likely to be the future’s visionaries,” education researcher Karen Arnold explains. “They typically settle into the system instead of shaking it up.”
The lesson was that scientists always have many options, and their frameworks are useful in some ways but arbitrary in others.
If only it weren’t for the people . . . earth would be an engineer’s paradise. —Kurt Vonnegut
Tradition (n.) Peer pressure from dead people.
She was surprised to find that the more psychological safety a team felt, the higher its error rates. It appeared that psychological safety could breed complacency. When trust runs deep in a team, people might not feel the need to question their colleagues or double-check their own work. But Edmondson soon recognized a major limitation of the data: the errors were all self-reported. To get an unbiased measure of mistakes, she sent a covert observer into the units. When she analyzed those data, the results flipped: psychologically safe teams reported more errors, but they actually made fewer.
Edmondson is quick to point out that psychological safety is not a matter of relaxing standards, making people comfortable, being nice and agreeable, or giving unconditional praise. It’s fostering a climate of respect, trust, and openness in which people can raise concerns and suggestions without fear of reprisal. It’s the foundation of a learning culture.
How do you know? It’s a question we need to ask more often, both of ourselves and of others. The power lies in its frankness. It’s nonjudgmental—a straightforward expression of doubt and curiosity that doesn’t put people on the defensive.
By admitting some of their imperfections out loud, managers demonstrated that they could take it—and made a public commitment to remain open to feedback. They normalized vulnerability, making their teams more comfortable opening up about their own struggles. Their employees gave more useful feedback because they knew where their managers were working to grow.
Creating psychological safety can’t be an isolated episode or a task to check off on a to-do list. When discussing their weaknesses, many of the managers in our experiment felt awkward and anxious at first. Many of their team members were surprised by that vulnerability and unsure of how to respond. Some were skeptical: they thought their managers might be fishing for compliments or cherry-picking comments that made them look good. It was only over time—as managers repeatedly demonstrated humility and curiosity—that the dynamic changed.
It takes confident humility to admit that we’re a work in progress. It shows that we care more about improving ourselves than proving ourselves.
Organizational learning should be an ongoing activity, but best practices imply it has reached an endpoint. We might be better off looking for better practices.
Process accountability might sound like the opposite of psychological safety, but they’re actually independent. Amy Edmondson finds that when psychological safety exists without accountability, people tend to stay within their comfort zone, and when there’s accountability but not safety, people tend to stay silent in an anxiety zone. When we combine the two, we create a learning zone. People feel free to experiment—and to poke holes in one another’s experiments in service of making them better. They become a challenge network. One of the most effective steps toward process accountability . . .
Requiring proof is an enemy of progress. This is why companies like Amazon use a principle of disagree and commit. As Jeff Bezos explained it in an annual shareholder letter, instead of demanding convincing results, experiments start with asking people to make bets. “Look, I know we disagree on this but will you gamble with me on it?” The goal in a learning culture is to welcome these kinds of experiments, to make rethinking so familiar that it becomes routine.
Hayley Lewis, Sketchnote summary of A Spectrum of Reasons for Failure. From “Strategies for Learning from Failure” by Amy Edmondson, Harvard Business Review, April 2011. Illustration drawn May 2020. London, United Kingdom. Copyright © 2020 by HALO Psychology Limited.
I’ve often wondered what it would have taken to convince him to rethink his chosen line of work—and what he truly wanted out of a career.

