The Righteous Mind: Why Good People are Divided by Politics and Religion
Psychopathy does not appear to be caused by poor mothering or early trauma, or to have any other nurture-based explanation.
It’s a genetically heritable condition31 that creates brains that are unmoved by the needs, suffering, or dignity of others.32
the rider’s job is to serve the elephant, not to act as a moral compass.
Infants as young as two months old will look longer at an event that surprises them than at an event they were expecting.
But if the infant’s mind comes already wired to interpret events in certain ways, then infants can be surprised when the world violates their expectations.
psychologists discovered that infants are born with some knowledge of physics and mechanics: they expect that objects will move according to Newton’s laws of motion, and they get startled when psychologists show them scenes that should be physically imp...
infants come equipped with innate abilities to understand their social world as well. They understand things like harming and helping.
The researchers concluded that “the capacity to evaluate individuals on the basis of their social interactions is universal and unlearned.”
the elephant begins making something like moral judgments during infancy, long before language and reasoning arrive.
moral intuitions emerge very early and are necessary for moral development.
“trolley dilemma,”
Philosophers have long disagreed about whether it’s acceptable to harm one person in order to help or save several people.
Utilitarianism is the philosophical school that says you should always aim to bring about the greatest total good, even if a few people get hurt along the way, so if there’s really no other way to save those five lives, go ahead and push.
Other philosophers believe that we have duties to respect the rights of individuals, and we must not harm...
goals, even moral goals such as saving lives. This view is known as deontology (from the Greek root ...
Greene had a hunch that gut feelings were what often drove people to make deontological judgments, whereas utilitarian judgments were more cool and calculating.
Greene wrote twenty stories that, like the trolley story, involved direct personal harm, usually done for a good reason.
Greene also wrote twenty stories involving impersonal harm, such as a version of the trolley dilemma in which you save the five people by flipping a switch that diverts the trolley onto a side track, where it will kill just one person.
When people read stories involving personal harm, they showed greater activity in several regions of the brain related to emotional processing.
A slave is never supposed to question his master, but most of us can think of times when we questioned and revised our first intuitive judgment.
The elephant is far more powerful than the rider, but it is not an absolute dictator.
The main way that we change our minds on moral issues is by interacting with other people.
When discussions are hostile, the odds of change are slight. The elephant leans away from the opponent, and the rider works frantically to rebut the opponent’s charges.
But if there is affection, admiration, or a desire to please the other person, then the elephant leans toward that person and the rider tries to find the truth in the other person’s arguments.
There are even times when we change our minds on our own, with no help from other people.
And finally, it is possible for people simply to reason their way to a moral conclusion that contradicts their initial intuitive judgment, although I believe this process is rare.
They supplied half of the subjects with a really bad argument to justify consensual incest (“If Julie and Mark make love, then there is more love in the world”). They gave the other half a stronger supporting argument (about how the aversion to incest is really caused by an ancient evolutionary adaptation for avoiding birth defects in a world without contraception, but because Julie and Mark use contraception, that concern is not relevant).
The elephant leaned as soon as subjects heard the story. The rider then found a way to rebut the argument (good or bad), and subjects condemned the story equally in both cases.
some subjects were not allowed to respond right away. The computer forced them to wait for two minutes before they could declare their judgment about Julie and Mark.
While the subject was sitting there staring at the screen, the lean diminished and the rider had the time and freedom to think about the supporting argument.
people who were forced to reflect on the good argument for two minutes actually did become substantially more tolerant toward Julie and Mark’s decision to have sex. The delay allowed the rider to think for himself and to decide upon a judgment that for many subjects was contrary to the elephant’s initial inclination.
But if you force the two to sit around and chat for a few minutes, the elephant actually opens up to advice from the rider and arguments from outside sources. Intuitions come first, and under normal circumstances they cause us to engage in socially strategic reasoning, but there are ways to make the relationship more of a two-way street.
The first principle of moral psychology is Intuitions come first, strategic reasoning second. In support of this principle, I reviewed six areas of experimental research demonstrating that:
• Brains evaluate instantly and constantly (as Wundt and Zajonc said).
• Social and political judgments depend heavily on quick intuitive flashes (as Todorov and work with the IAT have shown).
• Our bodily states sometimes influence our moral judgments. Bad smells and tastes can make people more judgmental (as can anything that makes people think about purity and cleanliness).
• Psychopaths reason but don’t feel (and are severely deficient morally).
• Babies feel but don’t reason (and have the beginnings of morality).
• Affective reactions are in the right place at the right time in the brain …
a wave of more recent...
The elephant (automatic processes) is where most of the action is in moral psychology. Reasoning matters, of course, particularly between people, and particularly when reasons trigger new intuitions. Elephants rule, but they are neither dumb nor despotic. Intuitions can be shaped by reasoning, especially when reasons are embedded in a friendly conversation or an emotionally compelling novel, movie, or news story.48
when we see or hear about the things other people do, the elephant begins to lean immediately. The rider, who is always trying to anticipate the elephant’s next move, begins looking around for a way to support such a move.
Suppose the gods were to flip a coin on the day of your birth. Heads, you will be a supremely honest and fair person throughout your life, yet everyone around you will believe you’re a scoundrel. Tails, you will cheat and lie whenever it suits your needs, yet everyone around you will believe you’re a paragon of virtue.
Glaucon asks Socrates to imagine what would happen to a man who had the mythical ring of Gyges, a gold ring that makes its wearer invisible at will:
Glaucon’s thought experiment implies that people are only virtuous because they fear the consequences of getting caught—especially the damage to their reputations.
a just city is one in which there is harmony, cooperation, and a division of labor between all the castes.
But in an unjust city, one group’s gain is another’s loss, faction schemes against faction, the powerful exploit the weak, and the city is divided against itself.
Socrates then argues that exactly these sorts of relationships apply within a just, harmonious, and happy person.
In this chapter I’ll show that reason is not fit to rule; it was designed to seek justification, not truth.
I’ll show that Glaucon was right: people care a great deal more about appearance and reputation than about reality.
designing an ethical society is to make sure that everyone’s reputation is on the line all the time, so that bad behavior will always bring bad consequences.
What, then, is the function of moral reasoning? Does it seem to have been shaped, tuned, and crafted (by natural selection) to help us find the truth, so that we can know the right way to behave and condemn those who behave wrongly? If you believe that, then you are a rationalist, like Plato, Socrates, and Kohlberg.7 Or does moral reasoning seem to have been shaped, tuned, and crafted to help us pursue socially strategic goals, such as guarding our reputations and convincing other people to support us, or our team, in disputes? If you believe that, then you are a Glauconian.