The Righteous Mind: Why Good People are Divided by Politics and Religion
Human beings are the world champions of cooperation beyond kinship, and we do it in large part by creating systems of formal and informal accountability.
defines accountability as the “explicit expectation that one will be called upon to justify one’s beliefs, feelings, or actions to others,” coupled with an expectation that people will reward or punish us based on how well we justify ourselves.
when people know in advance that they’ll have to explain themselves, they think more systematically and self-critically. They are less likely to jump to premature conclusions and more likely to revise their beliefs in response to evidence.
Exploratory thought is an “evenhanded consideration of alternative points of view.” Confirmatory thought is “a one-sided attempt to rationalize a particular point of view.”13
Accountability increases exploratory thought only when three conditions apply: (1) decision makers learn before forming any opinion that they will be accountable to an audience, (2) the audience’s views are unknown, and (3) they believe the audience is well informed and interested in accuracy.
Tetlock concludes that conscious reasoning is carried out largely for the purpose of persuasion, rather than discovery.
Ed Koch, the brash mayor of New York City in the 1980s, was famous for greeting constituents with the question “How’m I doin’?” It was a humorous reversal of the usual New York “How you doin’?” but it conveyed the chronic concern of elected officials.
Research on self-esteem suggests that we are all unconsciously asking Koch’s question every day, in almost every encounter.
Leary suggested that self-esteem is more like an internal gauge, a “sociometer” that continuously measures your value as a relationship partner.
Everyone had to sit alone in a room and talk about themselves for five minutes, speaking into a microphone. At the end of each minute they saw a number flash on a screen in front of them. That number indicated how much another person listening in from another room wanted to interact with them in the next part of the study. With ratings from 1 to 7 (where 7 is best), you can imagine how it would feel to see the numbers drop while you’re talking:
4 … 3 … 2 … 3 … 2.
Not surprisingly, people who admitted that they cared about other people’s opinions had big reactions to the numbers.
the self-proclaimed mavericks suffered shocks almost as big.
Because appearing concerned about other people’s opinions makes us look weak, we (like politicians) often deny that we care about public opinion polls. But the fact is that we care a lot about what others think of us. The only people known to have no sociometer are psychopaths.17
Wason called this phenomenon the confirmation bias, the tendency to seek out and interpret new evidence in ways that confirm what you already think.
Schools don’t teach people to reason thoroughly; they select the applicants with higher IQs, and people with higher IQs are able to generate more reasons.
Perkins found that IQ was by far the biggest predictor of how well people argued, but it predicted only the number of my-side arguments.
Perkins concluded that “people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and evenhandedly.”
Being asked directly removes plausible deniability; it would take a direct lie to keep the money. As a result, people are three times more likely to be honest.
You can’t predict who will return the money based on how people rate their own honesty, or how well they are able to give the high-minded answer on a moral dilemma of the sort used by Kohlberg.
When given the opportunity, many honest people will cheat. In fact, rather than finding that a few bad apples weighted the averages, we discovered that the majority of people cheated, and that they cheated just a little bit.26
they cheated only up to the point where they themselves could no longer find a justification that would preserve their belief in their own honesty.
The bottom line is that in lab experiments that give people invisibility combined with plausible deniability, most people cheat.
When my son, Max, was three years old, I discovered that he’s allergic to must.
The word can is so much nicer: “Can you get dressed, so that we can go to school?”
The difference between can and must is the key to understanding the profound effects of self-interest on reasoning.
His simple formulation is that when we want to believe something, we ask ourselves, “Can I believe it?”
In contrast, when we don’t want to believe something, we ask ourselves, “Must I believe it?”
When subjects are told that an intelligence test gave them a low score, they choose to read articles criticizing (rather than supporting) the validity of IQ tests.
The difference between a mind asking “Must I believe it?” versus “Can I believe it?” is so profound that it even influences visual perception.
Many political scientists used to assume that people vote selfishly, choosing the candidate or policy that will benefit them the most. But decades of research on public opinion have led to the conclusion that self-interest is a weak predictor of policy preferences.
Rather, people care about their groups, whether those be racial, regional, religious, or political.
Political opinions function as “badges of social membership.”37
Liberals and conservatives actually move further apart when they read about research
does the partisan brain work as Hume says, with emotional and intuitive processes running the show and only putting in a call to reasoning when its services are needed to justify a desired conclusion? The data came out strongly supporting Hume.
All animal brains are designed to create flashes of pleasure when the animal does something important for its survival, and small pulses of the neurotransmitter dopamine in the ventral striatum (and a few other places) are where these good feelings are manufactured.
partisans escaping from handcuffs (by thinking about the final slide, which restored their confidence in their candidate) got a little hit of that dopamine.
As an intuitionist, I’d say that the worship of reason is itself an illustration of one of the most long-lived delusions in Western history: the rationalist delusion.
From Plato through Kant and Kohlberg, many rationalists have asserted that the ability to reason well about ethical issues causes good behavior. They believe that reasoning is the royal road to moral truth, and they believe that people who reason well are more likely to act morally.
expertise in moral reasoning does not seem to improve moral behavior, and it might even make it worse (perhaps by making the rider more skilled at post hoc justification).
They concluded that most of the bizarre and depressing research findings make perfect sense once you see reasoning as having evolved not to help us find truth but to help us engage in arguments, persuasion, and manipulation in the context of discussions with other people.
“skilled arguers … are not after the truth but after arguments supporting their views.”
We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play. But if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system.
Classes are for riders, and riders are just going to use their new knowledge to serve their elephants more effectively.
If you want to make people behave more ethically, there are two ways you can go. You can change the elephant, which takes a long time and is hard to do. Or, to borrow an idea from the book Switch, by Chip Heath and Dan Heath,54 you can change the path that the elephant and rider find themselves traveling on.
The first principle of moral psychology is Intuitions come first, strategic reasoning second.
• We are obsessively concerned about what others think of us, although much of the concern is unconscious and invisible to us.
• Conscious reasoning functions like a press secretary who automatically justifies any position taken by the president.
• With the help of our press secretary, we are able to lie and cheat often, and then cover it up so effectively that we convince even ourselves.
• Reasoning can take us to almost any conclusion we want to reach, because we ask “Can I believe it?” when we want to believe something, but “Must I believe it?” when we don’t want to believe. The answer is …
Reasoning matters, particularly because reasons do sometimes influence other people, but most of the action in moral psychology is in the intuitions.
When I had interviewed college students on the Penn campus a month earlier, this question brought forth their moral justifications quite smoothly.