Kindle Notes & Highlights
Mistakes Were Made (But Not by Me) by Carol Tavris
Read between December 21, 2020 and January 14, 2021
As fallible human beings, all of us share the impulse to justify ourselves and avoid taking responsibility for actions that turn out to be harmful, immoral, or stupid.
It goes further than that. Most people, when directly confronted by evidence that they are wrong, do not change their point of view or plan of action but justify it even more tenaciously.
Now, between the conscious lie to fool others and unconscious self-justification to fool ourselves, there’s a fascinating gray area patrolled by an unreliable, self-serving historian—memory. Memories are often pruned and shaped with an ego-enhancing bias that blurs the edges of past events, softens culpability, and distorts what really happened.
None of us can avoid making blunders. But we do have the ability to say, “This is not working out here. This is not making sense.” To err is human, but humans then have a choice between covering up and fessing up. The choice we make is crucial to what we do next. We are forever being told that we should learn from our mistakes, but how can we learn unless we first admit that we made those mistakes?
The engine that drives self-justification, the energy that produces the need to justify our actions and decisions—especially the wrong ones—is the unpleasant feeling that Festinger called “cognitive dissonance.”
These findings do not mean that people enjoy painful experiences or that they enjoy things because they are associated with pain. What they mean is that if a person voluntarily goes through a difficult or painful experience in order to attain some goal or object, that goal or object becomes more attractive.
On the contrary: if new information is consonant with our beliefs, we think it is well founded and useful—“Just what I always said!” But if the new information is dissonant, then we consider it biased or foolish—“What a dumb argument!” So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief. This mental contortion is called the “confirmation bias.”
The confirmation bias is especially glaring in matters of political observation; we see only the positive attributes of our side and the negative attributes of theirs.
In one study, people were monitored by functional magnetic resonance imaging (fMRI) as they tried to process either dissonant or consonant information about George Bush or John Kerry. Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain were activated when consonance was restored.
In this study, the researchers simply intercepted people who were standing in line to place two-dollar bets and other people who had just left the window. The investigators asked them how certain they were that their horses would win. The bettors who had placed their bets were far more certain about their choice than the folks waiting in line. Yet nothing had changed except the finality of placing the bet. People become more certain they are right about something they just did if they can’t undo it.
People want to believe that, being smart and rational individuals, they know why they make the choices they do, so they are not always happy when you tell them the actual reason for their actions.
Actually, decades of experimental research on catharsis have found exactly the opposite: when people vent their feelings aggressively, they often feel worse, pump up their blood pressure, and make themselves even angrier.
It’s the people who almost decide to live in glass houses who throw the first stones.
How do you get an honest man to lose his ethical compass? You get him to take one step at a time, and self-justification will do the rest.
As we will see, his willingness to concede that his own side made a mistake is rare; few are prepared to do the same. Instead, conservatives and liberals alike will bend over backward to reduce dissonance in a way that is favorable to them and their team. The specific tactics vary, but our efforts at self-justification are all designed to serve our need to feel good about what we have done, what we believe, and who we are.
Along with the confirmation bias, the brain comes packaged with other self-serving habits that allow us to justify our own perceptions and beliefs as being accurate, realistic, and unbiased. Social psychologist Lee Ross named this phenomenon “naive realism,” the inescapable conviction that we perceive objects and events clearly, “as they really are.”
We assume that other reasonable people see things the same way we do. If they disagree with us, they obviously aren’t seeing clearly.
In one experiment, Ross took peace proposals created by Israeli negotiators, labeled them as Palestinian proposals, and asked Israeli citizens to judge them. “The Israelis liked the Palestinian proposal attributed to Israel more than they liked the Israeli proposal attributed to the Palestinians,” he says. “If your own proposal isn’t going to be attractive to you when it comes from the other side, what chance is there that the other side’s proposal is going to be attractive when it actually comes from the other side?”
We take our own involvement in an issue as a source of accuracy and enlightenment (“I’ve felt strongly about gun control for years, therefore I know what I’m talking about”), but we regard such personal feelings on the part of others who hold different views as a source of bias (“She can’t possibly be impartial about gun control because she’s felt strongly about it for years”).
Our innate biases are, as two legal scholars put it, “like optical illusions in two important respects—they lead us to wrong conclusions from data, and their apparent rightness persists even when we have been shown the trick.”
Prejudices emerge from the disposition of the human mind to perceive and process information in categories. Categories is a nicer, more neutral word than stereotypes, but it’s the same thing.
Cognitive psychologists view stereotypes as energy-saving devices that allow us to make efficient decisions on the basis of past experiences; they help us quickly process new information, retrieve memories, understand real differences between groups, and predict, often with considerable accuracy, how others will behave or think.
That’s the upside. The downside is that stereotypes flatten out differences within the category we are looking at and exaggerate differences between categories.
Us is the most fundamental social category in the brain’s organizing system, and the concept is hardwired.
Without feeling attached to groups that give our lives meaning, identity, and purpose, we would suffer the intolerable sensation that we were loose marbles rattling around in a random universe. Therefore, we will do what it takes to preserve these attachments.
The line “But some of my best friends are [X],” well deserving of the taunts it now gets, has persisted because it is such an efficient way of resolving the dissonance created when a prejudice runs headlong into an exception.
Participants successfully control their negative feelings under normal conditions, but as soon as they become angry or frustrated or when their self-esteem wobbles, they express their prejudice directly because now they can justify it: “I’m not a bad or prejudiced person, but, hey—he insulted me!”
Our greatest hope of self-correction lies in making sure we are not operating in a hall of mirrors in which all we see are distorted reflections of our own desires and convictions. We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off.
That is why memory researchers love to quote Nietzsche: “ ‘I have done that,’ says my memory. ‘I cannot have done that,’ says my pride, and remains inexorable. Eventually—memory yields.”
Memories are not buried somewhere in the brain like bones at an archaeological site; you can’t dig them up, perfectly preserved. We do not remember everything that happens to us; we select only highlights. If we didn’t forget, our minds could not work efficiently, because they would be cluttered with mental junk—the temperature last Wednesday, a boring conversation on the bus, the price of peaches at the market yesterday.
The public’s typical impulse is to take one party’s side and conclude that the other side is lying. But an understanding of memory and self-justification leads us to a more nuanced perspective: a person doesn’t have to be lying to be wrong.
Sex researchers repeatedly find that many people rarely say what they mean at the start of a sexual encounter, and they often don’t mean what they say. They find it difficult to say what they dislike because they don’t want to hurt the other person’s feelings. They may think they want intercourse and then change their minds. They may think they don’t want intercourse and change their minds. They are, in short, engaging in what social psychologist Deborah Davis calls a “dance of ambiguity.”
This dance of ambiguity benefits both partners; through vagueness and indirection, each party’s ego is protected in case the other says no. Indirection saves a lot of hurt feelings, but it also causes problems: the woman really thinks the man should have known she wanted him to stop, and he really thinks she gave consent.
Both parties believe they are telling the truth, but one or both may be wrong because of the unreliability of memory—which is reconstructive in nature and exquisitely susceptible to suggestion—and because both are motivated to justify their actions.
As a result, the woman might falsely remember saying things that she thought about saying but did not say to stop the situation, because she sees herself as an assertive person who would stand up for herself. The man might falsely remember that he tried to verify the woman’s consent (which he did not do), because he sees himself as a decent guy who would never rape a woman.
Conway and Ross referred to this self-serving memory distortion as “getting what you want by revising what you had.” On the larger stage of life, many of us do just that: We misremember our history as being worse than it was, thus distorting our perception of how much we have improved so that we’ll feel better about ourselves now.
An appreciation of the distortions of memory, a realization that even deeply felt memories might be wrong, might encourage people to hold their memories more lightly, drop the certainty that their memories are always accurate, and let go of the appealing impulse to use the past to justify problems of the present.
In their paper “The Seductive Allure of Neuroscience Explanations,” Deena Weisberg and her colleagues demonstrated that if you give one group of laypeople a straightforward explanation of some behavior and another group the same explanation but with vague references to the brain thrown in (“brain scans indicate” or “the frontal-lobe brain circuitry known to be involved”), people assume the latter is more scientific—and therefore more real.
Scientific reasoning is useful to anyone in any job because it makes us face the possibility, even the dire reality, that we were mistaken. It forces us to confront our self-justifications and put them on public display for others to puncture. At its core, therefore, science is a form of arrogance control.
Freud, for all his insights into civilization and its discontents, was not doing science. For any theory to be scientific, it must be stated in such a way that it can be shown to be false as well as true.
Observation and intuition without independent verification are unreliable guides.
To guard against the bias of our own direct observations, scientists invented the control group: the group that isn’t getting the new therapeutic method, the patients who aren’t getting the new drug. Most people understand the importance of control groups in a study of a new drug’s effectiveness, because without a control group, you can’t say if people’s positive response is due to the drug or to the placebo effect—the general expectation that the drug will help them.
This change, from the uncritical “believe the children” to the more discerning “understand the children,” reflects a recognition that mental-health professionals need to think more like scientists and less like advocates; they must weigh all the evidence fairly and consider the possibility that their suspicions are unfounded. If they do not, it will not be justice that is served, but self-justification.
Are 10,193 to perhaps 50,193 wrongfully imprisoned citizens too many? Can we do better? How? There are no obvious answers. The good news is that the great majority of convicted criminal defendants in America are guilty. The bad news is that a substantial number are not.
Once officers believe that lying is defensible and even an essential aspect of the job, he adds, “dissonant feelings of hypocrisy no longer arise. The officer learns to rationalize lying as a moral act or at least as not an immoral act. Thus, his self-concept as a decent, moral person is not substantially compromised.”
The interrogator’s presumption of guilt creates a self-fulfilling prophecy. It makes the interrogator more aggressive, which in turn makes innocent suspects behave more suspiciously.
Kassin lectures widely to detectives and police officers to show them how their techniques of interrogation can backfire. They always nod knowingly, he says, and agree with him that false confessions are to be avoided, but then they immediately add that they themselves have never coerced anyone into a false confession. “How do you know?” Kassin asked one cop. “Because I never interrogate innocent people,” he said.
Like the students, they did no better than chance, yet they were convinced that their accuracy rate was close to 100 percent. Their experience and training did not improve their performance. Their experience and training simply increased their belief that it did.