Human beings are the world champions of cooperation beyond kinship, and we do it in large part by creating systems of formal and informal accountability.
Tetlock defines accountability as the “explicit expectation that one will be called upon to justify one’s beliefs, feelings, or actions to others,” coupled with an expectation that people will reward or punish us based on how well we justify ourselves.
When people know in advance that they’ll have to explain themselves, they think more systematically and self-critically. They are less likely to jump to premature conclusions and more likely to revise their beliefs in response to evidence.
Exploratory thought is an “evenhanded consideration of alternative points of view.” Confirmatory thought is “a one-sided attempt to rationalize a particular point of view.”13
Accountability increases exploratory thought only when three conditions apply: (1) decision makers learn before forming any opinion that they will be accountable to an audience, (2) the audience’s views are unknown, and (3) they believe the audience is well informed and interested in accuracy.
Tetlock concludes that conscious reasoning is carried out largely for the purpose of persuasion, rather than discovery.
Ed Koch, the brash mayor of New York City in the 1980s, was famous for greeting constituents with the question “How’m I doin’?” It was a humorous reversal of the usual New York “How you doin’?” but it conveyed the chronic concern of elected officials.
Research on self-esteem suggests that we are all unconsciously asking Koch’s question every day, in almost every encounter.
Leary suggested that self-esteem is more like an internal gauge, a “sociometer” that continuously measures your value as a relationship partner.
Everyone had to sit alone in a room and talk about themselves for five minutes, speaking into a microphone. At the end of each minute they saw a number flash on a screen in front of them. That number indicated how much another person listening in from another room wanted to interact with them in the next part of the study. With ratings from 1 to 7 (where 7 is best), you can imagine how it would feel to see the numbers drop while you’re talking:
4 … 3 … 2 … 3 … 2.
Not surprisingly, people who admitted that they cared about other people’s opinions had big reactions to the numbers.
The self-proclaimed mavericks suffered shocks almost as big.
Because appearing concerned about other people’s opinions makes us look weak, we (like politicians) often deny that we care about public opinion polls. But the fact is that we care a lot about what others think of us. The only people known to have no sociometer are psychopaths.17
Wason called this phenomenon the confirmation bias, the tendency to seek out and interpret new evidence in ways that confirm what you already think.
Schools don’t teach people to reason thoroughly; they select the applicants with higher IQs, and people with higher IQs are able to generate more reasons.
Perkins found that IQ was by far the biggest predictor of how well people argued, but it predicted only the number of my-side arguments.
Perkins concluded that “people invest their IQ in buttressing their own case rather than in exploring the entire issue more fully and evenhandedly.”
Being asked directly removes plausible deniability; it would take a direct lie to keep the money. As a result, people are three times more likely to be honest.
You can’t predict who will return the money based on how people rate their own honesty, or how well they are able to give the high-minded answer on a moral dilemma of the sort used by Kohlberg.
When given the opportunity, many honest people will cheat. In fact, rather than finding that a few bad apples weighted the averages, we discovered that the majority of people cheated, and that they cheated just a little bit.26
They cheated only up to the point where they themselves could no longer find a justification that would preserve their belief in their own honesty.
The bottom line is that in lab experiments that give people invisibility combined with plausible deniability, most people cheat.
When my son, Max, was three years old, I discovered that he’s allergic to must.
The word can is so much nicer: “Can you get dressed, so that we can go to school?”
The difference between can and must is the key to understanding the profound effects of self-interest on reasoning.
His simple formulation is that when we want to believe something, we ask ourselves, “Can I believe it?”
In contrast, when we don’t want to believe something, we ask ourselves, “Must I believe it?”
When subjects are told that an intelligence test gave them a low score, they choose to read articles criticizing (rather than supporting) the validity of IQ tests.
The difference between a mind asking “Must I believe it?” versus “Can I believe it?” is so profound that it even influences visual perception.
Many political scientists used to assume that people vote selfishly, choosing the candidate or policy that will benefit them the most. But decades of research on public opinion have led to the conclusion that self-interest is a weak predictor of policy preferences.
Rather, people care about their groups, whether those be racial, regional, religious, or political.
Political opinions function as “badges of social membership.”37
Liberals and conservatives actually move further apart when they read about research
Does the partisan brain work as Hume says, with emotional and intuitive processes running the show and only putting in a call to reasoning when its services are needed to justify a desired conclusion? The data came out strongly supporting Hume.
All animal brains are designed to create flashes of pleasure when the animal does something important for its survival, and small pulses of the neurotransmitter dopamine in the ventral striatum (and a few other places) are where these good feelings are manufactured.
Partisans escaping from handcuffs (by thinking about the final slide, which restored their confidence in their candidate) got a little hit of that dopamine.
As an intuitionist, I’d say that the worship of reason is itself an illustration of one of the most long-lived delusions in Western history: the rationalist delusion.
From Plato through Kant and Kohlberg, many rationalists have asserted that the ability to reason well about ethical issues causes good behavior. They believe that reasoning is the royal road to moral truth, and they believe that people who reason well are more likely to act morally.
Expertise in moral reasoning does not seem to improve moral behavior, and it might even make it worse (perhaps by making the rider more skilled at post hoc justification).
They concluded that most of the bizarre and depressing research findings make perfect sense once you see reasoning as having evolved not to help us find truth but to help us engage in arguments, persuasion, and manipulation in the context of discussions with other people.
“skilled arguers … are not after the truth but after arguments supporting their views.”
We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are in play. But if you put individuals together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, and all individuals feel some common bond or shared fate that allows them to interact civilly, you can create a group that ends up producing good reasoning as an emergent property of the social system.
Classes are for riders, and riders are just going to use their new knowledge to serve their elephants more effectively.
If you want to make people behave more ethically, there are two ways you can go. You can change the elephant, which takes a long time and is hard to do. Or, to borrow an idea from the book Switch, by Chip Heath and Dan Heath,54 you can change the path that the elephant and rider find themselves traveling on.
The first principle of moral psychology is Intuitions come first, strategic reasoning second.
• We are obsessively concerned about what others think of us, although much of the concern is unconscious and invisible to us.
• Conscious reasoning functions like a press secretary who automatically justifies any position taken by the president.
• With the help of our press secretary, we are able to lie and cheat often, and then cover it up so effectively that we convince even ourselves.
• Reasoning can take us to almost any conclusion we want to reach, because we ask “Can I believe it?” when we want to believe something, but “Must I believe it?” when we don’t want to believe. The answer is almost always yes to the first question and no to the second.
Reasoning matters, particularly because reasons do sometimes influence other people, but most of the action in moral psychology is in the intuitions.
When I had interviewed college students on the Penn campus a month earlier, this question brought forth their moral justifications quite smoothly.