Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
Kindle Notes & Highlights
0%
We are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right.
1%
It seems like eons since Republican nominee Bob Dole described Bill Clinton as “my opponent, not my enemy,” but in fact he made that civilized remark in 1996. How quaint it now seems in contrast to Donald Trump, who regards his opponents (or people who simply disagree with him) as treasonous, disloyal rats and foes.
1%
Most people, when directly confronted by evidence that they are wrong, do not change their point of view or plan of action but justify it even more tenaciously.
2%
a report issued shortly thereafter by sixteen American intelligence agencies announced that the occupation of Iraq had actually increased Islamic radicalism and the risk of terrorism. Yet Bush said to a delegation of conservative columnists, “I’ve never been more convinced that the decisions I made are the right decisions.”
4%
It’s fascinating, and sometimes funny, to read doomsday predictions, but it’s even more fascinating to watch what happens to the reasoning of true believers when the prediction flops and the world keeps muddling along. Notice that hardly anyone ever says, “I blew it! I can’t believe how stupid I was to believe that nonsense”? On the contrary, most of the time the doomsayers become even more deeply convinced of their powers of prediction.
6%
The confirmation bias even sees to it that no evidence—the absence of evidence—is evidence for what we believe.
9%
Experts can sound pretty impressive, especially when they bolster their claims by citing their years of training and experience in a field. Yet hundreds of studies have shown that, compared to predictions based on actuarial data, predictions based on an expert’s years of training and personal experience are rarely better than chance.
9%
It’s the people who almost decide to live in glass houses who throw the first stones.
11%
As we will see, his willingness to concede that his own side made a mistake is rare; few are prepared to do so. Instead, conservatives and liberals alike will bend over backward to reduce dissonance in a way that is favorable to them and their team.
11%
But according to a study of 4,519 votes by Supreme Court justices in over five hundred cases between 1953 and 2011, the justices were more likely to support freedom of speech for speakers whose speech they agreed with; conservative members of the Roberts court ruled in favor of conservative speakers about 65 percent of the time and liberal speakers about 21 percent. The gap for liberal justices was not as great, more like 10 percent, but they too were more likely to vote in support of speakers whose political philosophy they shared.
14%
The reason Big Pharma spends so much on small gifts as well as the big ones is well known to marketers, lobbyists, and social psychologists: being given a gift evokes an implicit desire to reciprocate.
15%
All of us recognize variation within our own gender, party, ethnicity, or nation, but we are inclined to generalize about people in other categories and lump them all together as them.
16%
A stereotype might bend or even shatter under the weight of disconfirming information, but the hallmark of prejudice is that it is impervious to reason, experience, and counterexample.
16%
Once people have a prejudice, just as once they have a political ideology, they do not easily drop it, even if the evidence indisputably contradicts a core justification for it. Rather, they come up with another justification to preserve their belief or rationalize a course of action.
17%
Social psychologists Chris Crandall and Amy Eshelman, reviewing the huge research literature on prejudice, found that whenever people are emotionally depleted—when they are sleepy, frustrated, angry, anxious, drunk, or stressed—they become more willing to express their real prejudices toward another group.
25%
When Grace expressed doubts that her recovered memories were true, the therapist replied: “You’re sicker than you ever were.”
26%
Yet while psychiatrists learn about the brain, many still learn almost nothing about nonmedical causes of emotional disorders or about the questioning, skeptical essence of science.
26%
Yet the inherent privacy of the interaction means that therapists who lack training in science and skepticism have no internal corrections to the self-protecting cognitive biases that afflict us all.
27%
As research psychologist John Kihlstrom observed, “The weakness of the relationship between accuracy and confidence is one of the best-documented phenomena in the 100-year history of eyewitness memory research,”14 but van der Kolk was unaware of a finding that just about every undergraduate who has taken Psychology 101 would know.
28%
The scientific method consists of the use of procedures designed to show not that our predictions and hypotheses are right, but that they might be wrong.
28%
For any theory to be scientific, it must be stated in such a way that it can be shown to be false as well as true.
28%
Because of the confirmation bias, however, the “dependable observation” is not dependable. Clinical intuition—“I know it when I see it”—is the end of the conversation to many psychiatrists and psychotherapists but the start of the conversation to the scientist: “A good observation, but what exactly have you seen, and how do you know you are right?”
28%
What unites these clinical practitioners is their misplaced reliance on their own powers of observation and the closed loop it creates. Everything they see confirms what they believe. One danger of the closed loop is that it makes practitioners vulnerable to logical fallacies.
29%
“The notion that the mind protects itself by repressing or dissociating memories of trauma, rendering them inaccessible to awareness, is a piece of psychiatric folklore devoid of convincing empirical support.”21 Overwhelmingly, the evidence shows just the opposite. The problem for most people who have suffered traumatic experiences is not that they forget them but that they cannot forget them; the memories keep intruding.
31%
The researchers sampled many groups of professional psychologists and psychotherapists and found that the more scientifically trained they were, the more accurate their beliefs about memory and trauma. Among members of the Society for a Science of Clinical Psychology, only 17.7 percent believed that “traumatic memories are often repressed.” Among general psychotherapists, it was 60 percent; among psychoanalysts, 69 percent; and among neurolinguistic programming therapists and hypnotherapists, 81 percent—which is about the same percentage found in the general public.
35%
Once a detective decides that he or she has found the killer, the confirmation bias sees to it that the prime suspect becomes the only suspect. And if the prime suspect happens to be innocent, too bad—he’s still on the ropes.
35%
Doing whatever it takes to convict someone leads to ignoring or discounting evidence that would require officers to change their minds about a suspect. In extreme cases, it can tempt individual officers and even entire departments to cross the line from legal to illegal actions.
36%
Because police culture generally supports these justifications, it becomes even harder for an individual officer to resist breaking (or bending) the rules. Eventually, many cops will take the next steps, proselytizing other officers, persuading them to join them in a little innocent rule-breaking, and shunning or sabotaging officers who do not go along—and who are a reminder of the moral road not taken.
36%
Norm Stamper, a police officer for thirty-four years and former chief of the Seattle Police Department, has written that there isn’t a major police force in the country “that has escaped the problem: cops, sworn to uphold the law, [are] seizing and converting drugs to their own use [and] planting dope on suspects.”
36%
What’s wrong with that is that there is nothing to prevent the police from planting evidence and committing perjury to convict someone they believe is guilty—but who is innocent.
36%
The problem is that once they have decided on a likely suspect, they don’t think it’s possible that he or she is innocent. Therefore, once they have a suspect, they behave in ways to confirm that initial judgment of guilt, justifying the techniques they use in the belief that only guilty people will be vulnerable to them.
37%
Kassin lectures widely to detectives and police officers to show them how their techniques of interrogation can backfire. They always nod knowingly, he says, and agree with him that false confessions are to be avoided, but then they immediately add that they themselves have never coerced anyone into a false confession. “How do you know?” Kassin asked one cop. “Because I never interrogate innocent people,” he said.
37%
In the next phase of training, detectives become confident in their ability to read the suspect’s nonverbal cues: eye contact, body language, posture, hand gestures, and vehemence of denials. If the person won’t look you in the eye, the manual explains, that’s a sign of lying. If the person slouches (or sits rigidly), those are signs of lying. If the person denies guilt, that’s a sign of lying. Yet the Reid Technique advises interrogators to “deny suspect eye contact.” Deny a suspect the direct eye contact that they themselves regard as evidence of innocence?
37%
Promoters of the manual claim that their method trains investigators to determine whether someone is telling the truth or lying with an 80 to 85 percent level of accuracy. There is simply no scientific support for this claim.
38%
The response of prosecutors in Florida is typical. After more than 130 prisoners had been freed by DNA testing in the space of fifteen years, prosecutors decided they would respond by mounting a vigorous challenge to similar new cases. Convicted rapist Wilton Dedge had to sue the state to have the evidence in his case retested, over the fierce objections of prosecutors who said that the state’s interest in finality and the victim’s feelings should supersede concerns about Dedge’s possible innocence.42 Dedge was ultimately exonerated and released.
38%
That is why prosecutors interpret the same evidence in one of two ways, depending on when it is discovered. Early in an investigation, the police use DNA to confirm a suspect’s guilt or rule the person out. But when DNA tests are conducted after a defendant has been indicted and convicted, the prosecutors typically dismiss DNA results as irrelevant, not important enough to reopen the case.
39%
But from our vantage point, the greatest impediment to admitting and correcting mistakes in the criminal justice system is that most of its members reduce dissonance by denying that there is a problem.
39%
Currently, the professional training of most police officers, detectives, judges, and attorneys includes almost no information about their own cognitive biases; how to correct for them, as much as possible; and how to manage the dissonance they will feel when their beliefs meet disconfirming evidence. On the contrary, much of what they learn about psychology comes from self-proclaimed experts with no training in psychological science and who, as we saw, do not teach them to be more accurate in their judgments, merely more confident that they are accurate.
39%
Many judges, jurors, and police officers prefer certainties to science. Law professor D. Michael Risinger and attorney Jeffrey L. Loop have lamented “the general failure of the law to reflect virtually any of the insights of modern research on the characteristics of human perception, cognition, memory, inference or decision under uncertainty, either in the structure of the rules of evidence themselves, or the ways in which judges are trained or instructed to administer them.”
39%
We believe a snitch when he gives us information that helps us send someone to prison for life, but when he challenges our basic beliefs about the system, his allegations are promptly denied as nonsense without a closer look.”
39%
leading social scientists who have studied wrongful conviction are unanimous in recommending safeguards, such as the electronic recording of all interviews. As of 2019, only twenty-six states plus the District of Columbia require the police to electronically record interrogations in some or all felony crimes, although only five states stipulate a “preference” for audiovisual recording.54 Police and prosecutors have long resisted this requirement, fearing, perhaps, the embarrassing, dissonance-generating revelations it might create.
39%
Ralph Lacer, one of the interrogators of Bradley Page, justified the police position against videos on the grounds that a recording “is inhibiting” and makes it “hard to get at the truth.”
40%
But according to legal scholars and social scientists Deborah Davis and Richard Leo, American law enforcement remains steeped in its traditions, including adherence to the Reid Technique and similar procedures, maintaining “near absolute denial” that these techniques can and do produce false confessions and wrongful convictions.
50%
In his investigation of documented cases of abuse of prisoners, Conroy found that almost every military or police official he interviewed, whether British, South African, Israeli, or American, justified their practices by saying, in effect, “Our torture is never as severe and deadly as their torture”:
51%
How did the creators of CIA policy and those who carried it out reduce the dissonance caused by the information that the United States had been systematically violating the Geneva Conventions? The first way is to say that if we do it, it isn’t torture.
51%
Indeed, the Senate Intelligence report confirmed that no information gained from torturing detainees had proved useful in capturing or killing any terrorist, including Osama bin Laden.
52%
“Nothing predicts future behavior as much as past impunity.”
53%
People who are insecure in their religious beliefs may feel the impulse to silence and harass those who disagree with them, because the mere existence of those naysayers arouses the painful dissonance of doubt.
54%
The last American president to tell the country he had made a mistake that had disastrous consequences was John F. Kennedy in 1961.
54%
Whatever the error, sin, or mistake, apologies fail when listeners know that the speaker has to say something to reassure the public but the statement feels formulaic and obligatory (which it often is, having been generated by a press agent or someone in human resources).