Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
Most people, when directly confronted by evidence that they are wrong, do not change their point of view or plan of action but justify it even more tenaciously.
In the same way, we each draw our own moral lines and justify them. For example, have you ever done a little finessing of expenses on income taxes? That probably compensates for the legitimate expenses you forgot about, and besides, you’d be a fool not to, considering that everybody else does it. Did you fail to report some extra cash income? You’re entitled, given all the money that the government wastes on pork-barrel projects and programs you detest. Have you been writing personal e-mails and surfing the net at your office when you should have been tending to business? Those are perks of ...more
Elliot predicted that if people go through a great deal of pain, discomfort, effort, or embarrassment to get something, they will be happier with that “something” than if it came to them easily. For behaviorists, this was a preposterous prediction. Why would people like anything associated with pain? But for Elliot, the answer was obvious: self-justification. The cognition “I am a sensible, competent person” is dissonant with the cognition “I went through a painful procedure to achieve something”—say, join a group—“that turned out to be boring and worthless.” Therefore, a person would distort ...more
The results are always the same: severe initiations increase a member’s liking for the group.
So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief. This mental contortion is called the “confirmation bias.”8
Lenny Bruce, the legendary American humorist and social commentator, described this mechanism vividly as he watched the famous 1960 confrontation between Richard Nixon and John Kennedy in the nation’s very first televised presidential debate:   I would be with a bunch of Kennedy fans watching the debate and their comment would be, “He’s really slaughtering Nixon.” Then we would all go to another apartment, and the Nixon fans would say, “How do you like the shellacking he gave Kennedy?” And then I realized that each group loved their candidate so that a guy would have to be this blatant—he ...more
Democrats were reducing dissonance too, but in a different way: by actually forgetting that they originally were in favor of the war. Before the invasion, about 46 percent of Democrats supported it; by 2006, only 21 percent remembered having done so. Just before the war, 72 percent of Democrats said they thought Iraq had WMDs, but later, only 26 percent remembered having believed this. To maintain consonance, they were saying, in effect, “I knew all along that Bush was lying to us.”12
For example, in one study, people were monitored by functional magnetic resonance imaging (fMRI) as they tried to process dissonant or consonant information about George Bush or John Kerry. Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain were activated when consonance was restored.13
People who receive disconfirming or otherwise unwelcome information often do not simply resist it; they may come to support their original (wrong) opinion even more strongly—a backfire effect.
It’s much easier to slot that new evidence into an existing framework and do the mental justification to keep it there than it is to change the framework.15
In his illuminating book Stumbling on Happiness, social psychologist Dan Gilbert asks us to consider what would have happened at the end of Casablanca if Ingrid Bergman had not patriotically rejoined her Nazi-fighting husband but instead remained with Humphrey Bogart in Morocco. Would she, as Bogart tells her in a heart-wrenching speech, have regretted it—“Maybe not today, maybe not tomorrow, but soon, and for the rest of your life”? Or did she forever regret leaving Bogart? Gilbert gathered a wealth of data that shows that the answer to both questions is no, that either decision would have ...more
The more costly a decision in terms of time, money, effort, or inconvenience, and the more irrevocable its consequences, the greater the dissonance and the greater the need to reduce it by overemphasizing the good things about the choice made.
Therefore, when you are about to make a big purchase or an important decision—which car or computer to buy, whether to undergo plastic surgery, or whether to sign up for a costly self-help program—don’t ask someone who has just done it. That person will be highly motivated to convince you that it is the right thing to do.
Although everyone who went through the severe initiation said that they found the hypothesis intriguing and that they could see how most people would be affected in the way I predicted, they all took pains to assure me that their preference for the group had nothing to do with the severity of the initiation.
When you do anything that harms others—get them in trouble, verbally abuse them, or punch them out—a powerful new factor comes into play: the need to justify what you did.
Benjamin Franklin was a serious student of human nature as well as science and politics. While serving in the Pennsylvania legislature, Franklin was disturbed by the opposition and animosity of a fellow legislator. So he set out to win him over. He didn’t do it, he wrote, by “paying any servile respect to him”—that is, by doing the other man a favor—but by inducing his target to do a favor for him. (He asked the man to loan him a rare book from his library.)
For example, when Mrs. Keech’s doomsday predictions failed, imagine the excruciating dissonance her committed followers felt: “I am a smart person” clashed with “I just did an incredibly stupid thing: I gave away my house and possessions and quit my job because I believed a crazy woman.” To reduce that dissonance, her followers could either modify their opinion of their intelligence or justify the incredibly stupid thing they had just done. It’s not a close contest; justification wins by three lengths. Mrs. Keech’s true believers saved their self-esteem by deciding they hadn’t done anything ...more
A used-car salesman who knows that he is dishonest does not feel dissonance when he conceals the dismal repair record of the car he is trying to unload; a woman who believes she is unlovable does not feel dissonance when a man rejects her; a con man does not experience dissonance when he cheats his grandmother out of her life savings.
Note (Tamsen Webster): This is the same as successful people on Weight Watchers. Those for whom failure was consonant took it in stride. Those for whom it wasn’t found ways to explore and change.
Dissonance reduction, therefore, will protect high self-esteem or low self-esteem, whichever is central to a person’s core self-concept.
But the one who resisted the temptation will decide that cheating is far more immoral than he originally thought. In fact, people who cheat are disgraceful. In fact, people who cheat should be permanently expelled from school. We have to make an example of them.
Note (Tamsen Webster): Is this why ex-smokers are so virulent?
This process illustrates how people who have been sorely tempted, battled temptation, and almost given in to it—but resisted at the eleventh hour—come to dislike, even despise, those who did not succeed in the same effort. It’s the people who almost decide to live in glass houses who throw the first stones.
The metaphor of the pyramid applies to most important decisions involving moral choices or life options. Instead of cheating on an exam, for example, you can substitute deciding to begin a casual affair (or not), sample an illegal drug (or not), take steroids to improve your athletic ability (or not), stay in a troubled marriage (or not), name names to the House Un-American Activities Committee (or not), lie to protect your employer and job (or not), have children (or not), pursue a demanding career (or stay home with the kids), decide that a sensational allegation against a celebrity you ...more
Magruder said to Judge John Sirica: “I know what I have done, and Your Honor knows what I have done. Somewhere between my ambition and my ideals, I lost my ethical compass.” How do you get an honest man to lose his ethical compass? You get him to take one step at a time, and self-justification will do the rest.
The conservative columnist William Safire once described the “psychopolitical challenge” that voters face: “how to deal with cognitive dissonance.”37 He began with a story of his own such challenge. During Bill Clinton’s administration, Safire recounted, he had criticized Hillary Clinton for trying to conceal the identity of the members of her health-care task force. He wrote a column castigating her efforts at secrecy, which he said were toxic to democracy. No dissonance there; those bad Democrats are always doing bad things. Six years later, however, he found that he was “afflicted” by ...more
The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on its owner the comforting delusion that he or she does not have any.
Social psychologist Lee Ross calls this phenomenon “naïve realism,” the inescapable conviction that we perceive objects and events clearly, “as they really are.”2 We assume that other reasonable people see things the same way we do. If they disagree with us, they obviously aren’t seeing clearly.
Naïve realism creates a logical labyrinth because it presupposes two things: One, people who are open-minded and fair ought to agree with a reasonable opinion, and, two, any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it.
Ross took peace proposals created by Israeli negotiators, labeled them as Palestinian proposals, and asked Israeli citizens to judge them. “The Israelis liked the Palestinian proposal attributed to Israel more than they liked the Israeli proposal attributed to the Palestinians,” he says. “If your own proposal isn’t going to be attractive to you when it comes from the other side, what chance is there that the other side’s proposal is going to be attractive when it actually comes from the other side?”3
We take our own involvement in an issue as a source of accuracy and enlightenment (“I’ve felt strongly about gun control for years; therefore I know what I’m talking about”), but we regard such personal feelings on the part of others who hold different views as a source of bias (“She can’t possibly be impartial about gun control because she’s felt strongly about it for years”).
Our innate biases are, as two legal scholars put it, “like optical illusions in two important respects—they lead us to wrong conclusions from data, and their apparent rightness persists even when we have been shown the trick.”8
The greatest of faults, I should say, is to be conscious of none. —Thomas Carlyle, historian and essayist
Two investigators selected 161 studies, all published during the same six-year span, of the possible risks to human health of four chemicals. Of the studies funded by industry, only 14 percent found harmful effects on health; of those funded independently, fully 60 percent found harmful effects.15 A researcher examined more than 100 controlled clinical trials designed to determine the effectiveness of a new medication over older ones. Of those favoring the traditional drug, 13 percent had been funded by drug companies and 87 percent by nonprofit institutions.16 Two Danish investigators ...more
“Americans have witnessed an increase in hospitalizations and deaths from diseases like whooping cough, measles, mumps, and bacterial meningitis,” writes Paul Offit, chief of the Division of Infectious Diseases and director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, “because some parents have become more frightened by vaccines than by the diseases they prevent.”22 We noted in chapter 1 that people often hold on to a belief long after they know rationally that it’s wrong, and this is especially true if they have taken many steps down the pyramid in support of ...more
The AMA’s Council on Ethical and Judicial Affairs designed an initiative to educate doctors about the ethical problems involved in accepting gifts from the drug industry. That initiative was funded by $590,000 in gifts from Eli Lilly and Company, GlaxoSmithKline, Pfizer, the U.S. Pharmaceutical Group, AstraZeneca Pharmaceuticals, the Bayer Corporation, Procter and Gamble, and Wyeth-Ayerst Pharmaceutical.
Prejudices emerge from the disposition of the human mind to perceive and process information in categories. Categories is a nicer, more neutral word than stereotypes, but it’s the same thing. Cognitive psychologists view stereotypes as energy-saving devices that allow us to make efficient decisions on the basis of past experiences;
That is why memory researchers love to quote Nietzsche: “‘I have done that,’ says my memory. ‘I cannot have done that,’ says my pride, and remains inexorable. Eventually—memory yields.”
Moreover, recovering a memory is not at all like retrieving a file or playing a tape; it is like watching a few unconnected frames of a film and then figuring out what the rest of the scene must have been like.
We may reproduce poetry, jokes, and other kinds of information by rote, but when we remember complex information, we shape it to fit it into a story line.
After a while, you won’t be able to distinguish your actual memory from subsequent information that crept in from elsewhere. That phenomenon is called “source confusion,” otherwise known as the “where did I hear that?” problem.
Memories create our stories, but our stories also create our memories. Once we have a narrative, we shape our memories to fit into it.
In a series of experiments, Barbara Tversky and Elizabeth Marsh showed how we “spin the stories of our lives.” In one, people read a story about two roommates, both of whom did something annoying and something sociable. Then everyone was asked to write a letter about one of the roommates, either a letter of complaint to a housing authority or a letter of recommendation to a social club. As they wrote, the study participants added elaborations and details to their letters that had not been part of the original story. For example, if they were writing a recommendation, they might add, “Rachel is ...more
Men and women alike remember having fewer sexual partners than they’ve actually had; they remember having far more sex with those partners than they actually had; and they remember using condoms more often than they actually did. People also remember voting in elections they didn’t vote in; they remember voting for the winning candidate rather than the politician they did vote for; they remember giving more to charity than they really did; they remember that their children walked and talked at an earlier age than they really did . . . You get the idea.10
Elizabeth Loftus, a leading scientist in the field of memory, calls this process “imagination inflation,” because the more you imagine something, the more confident you become that it really happened—and the more likely you are to inflate it into an actual memory, adding details as you go.
At its core, therefore, science is a form of arrogance control.
For any theory to be scientific, it must be stated in such a way that it can be shown to be false as well as true.
The alternative, that you sent an innocent man to prison for fifteen years, is so antithetical to your view of your competence that you will jump through multiple mental hoops to convince yourself that you couldn’t possibly have made such a blunder.
As with the psychotherapists we discussed in chapter 4, training does not increase accuracy; it increases people’s confidence in their accuracy.
Doubt is not the enemy of justice; overconfidence is.
When investigators start looking for elements of a crime that match a suspect’s profile, they also start overlooking elements that do not match.
Cognitive dissonance theory offers “a potent, inexpensive, and inexhaustible tool for accomplishing this goal: the officer’s own self-concept.”