The Righteous Mind: Why Good People are Divided by Politics and Religion
Everything goes ...
These synthesizers were assisted by the rebirth of sociobiology in 1992 under a new name—evolutionary psychology.24
Patterns, Thinking, and Cognition, by Howard Margolis
Margolis was trying to understand why people’s beliefs about political issues are often so poorly connected to objective facts, and he hoped that cognitive science could solve the puzzle.
I can’t call for the community to punish you simply because I don’t like what you’re doing. I have to point to something outside of my own preferences, and that pointing is our moral reasoning. We do moral reasoning not to reconstruct the actual reasons why we ourselves came to a judgment; we reason to find the best possible reasons why somebody else ought to join us in our judgment.
The crucial distinction is really between two different kinds of cognition: intuition and reasoning.
called these two kinds of cognition the rider (controlled processes, including “reasoning-why”) and the elephant (automatic processes, including emotion, intuition, and all forms of “seeing-that”).
If you want to change people’s minds, you’ve got to talk to their elephants.
“If there is any one secret of success it lies in the ability to get the other person’s point of view and see things from their angle as well as your own.”
Empathy is an antidote to righteousness, although it’s very difficult to empathize across a moral divide.
Hume believed that reason was (and was only fit to be) the servant of the passions.
tried to show that Hume was right:
The rider evolved to serve the elephant.
The social intuitionist model starts with Hume’s model and makes it more social. Moral reasoning is part of our lifelong struggle to win friends and influence people.
Therefore, if you want to change someone’s mind about a moral or political issue, talk to the elephant first. If you ask people to believe something that violates their intuitions, they will devote their efforts to finding an escape hatch—a reason to doubt your argument or conclusion. They will almost always succeed.
It is easy to see the faults of others, but difficult to see one’s own faults.
Brains evaluate everything in terms of potential threat or benefit to the self, and then adjust behavior to get more of the good stuff and less of the bad.
Wundt said that affective reactions are so tightly integrated with perception that we find ourselves liking or disliking something the instant we notice it, sometimes even before we know what it is.
The brain tags familiar things as good things.
Babies seem to have some innate ability to process events in their physical world—the world of objects.
infants come equipped with innate abilities to understand their social world as well.
They understand things like harming...
“the capacity to evaluate individuals on the basis of their social interactions is universal and unlearned.”
by six months of age, infants are watching how people behave toward other people, and they are developing a preference for those who are nice rather than those who are mean.
emotional areas of the brain are the right places to be looking for the foundations of morality,
The rider evolved to serve the elephant, but it’s a dignified partnership, more like a lawyer serving a client than a slave serving a master.
When does the elephant listen to reason? The main way that we change our minds on moral issues is by interacting with other people.
But if there is affection, admiration, or a desire to please the other person, then the elephant leans toward that person and the rider tries to find the truth in the other person’s arguments.
There are even times when we change our minds on our own, with no help from other people. Sometimes we have conflicting intuitions about something, as many people do about abortion and other controversial issues.
finally, it is possible for people simply to reason their way to a moral conclusion that contradicts their initial intuitive judgment, although I believe this process is rare.
But people who were forced to reflect on the good argument for two minutes actually did become substantially more tolerant toward Julie and Mark’s decision to have sex.
Brains evaluate instantly and constantly
Social and political judgments depend heavily on quick intuitive flashes
Our bodily states sometimes influence our moral judgments. Bad smells and tastes can m...
Intuitions can be shaped by reasoning, especially when reasons are embedded in a friendly conversation or an emotionally compelling novel, movie, or news story.
Wouldn’t it have been most adaptive for our ancestors to figure out the truth, the real truth about who did what and why, rather than using all that brainpower just to find evidence in support of what they wanted to believe? That depends on which you think was more important for our ancestors’ survival: truth or reputation.
I’ll show that Glaucon was right: people care a great deal more about appearance and reputation than about reality.
the most important principle for designing an ethical society is to make sure that everyone’s reputation is on the line all the time, so that bad behavior will always bring bad consequences.
But when people know in advance that they’ll have to explain themselves, they think more systematically and self-critically. They are less likely to jump to premature conclusions and more likely to revise their beliefs in response to evidence.
Accountability increases exploratory thought only when three conditions apply: (1) decision makers learn before forming any opinion that they will be accountable to an audience, (2) the audience’s views are unknown, and (3) they believe the audience is well informed and interested in accuracy.
Our moral thinking is much more like a politician searching for votes than a scientist searching for truth.
People are quite good at challenging statements made by other people, but if it’s your belief, then it’s your possession—your child, almost—and you want to protect it, not challenge it and risk losing it.
Schools don’t teach people to reason thoroughly; they select the applicants with higher IQs, and people with higher IQs are able to generate more reasons.
“people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and evenhandedly.”
If people can literally see what they want to see—given a bit of ambiguity—is it any wonder that scientific studies often fail to persuade the general public?
decades of research on public opinion have led to the conclusion that self-interest is a weak predictor of policy preferences.
Rather, people care about their groups, whether those be racial, regional, religious, or political.
Our politics is groupish, not selfish.
“skilled arguers … are not after the truth but after arguments supporting their views.”