Kindle Notes & Highlights
by Carol Tavris
Read between April 30 - April 30, 2020
when Johnson came to believe in something, he would believe in it “totally, with absolute conviction, regardless of previous beliefs, or of the facts in the matter.”
The results are always the same: severe initiations increase a member’s liking for the group.
If, on your way to join a discussion group, a flowerpot fell from the open window of an apartment building and hit you on the head, you would not like that discussion group any better. But if you volunteered to get hit on the head by a flowerpot to become a member of the group, you would definitely like the group more.
Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain were activated when consonance was restored.13 These mechanisms provide a neurological basis for the observation that once our minds are made up, it is hard to change them.
People who receive disconfirming or otherwise unwelcome information often do not simply resist it; they may come to support their original (wrong) opinion even more strongly—a backfire effect. Once we are invested in a belief and have justified its wisdom, changing our minds is literally hard work. It’s much easier to slot that new evidence into an existing framework and do the mental justification to keep it there than it is to change the framework.
People become more certain they are right about something they just did if they can’t undo it.
The more costly a decision in terms of time, money, effort, or inconvenience, and the more irrevocable its consequences, the greater the dissonance and the greater the need to reduce it by overemphasizing the good things about the choice made.
Behavioral economists have shown how reluctant people are to accept these sunk costs—investments of time or money that they’ve sunk into an experience or relationship. Rather than cutting their losses, most people will throw good money after bad in hopes of recouping those losses and justifying their original decision.
Actually, decades of experimental research have found exactly the opposite: when people vent their feelings aggressively, they often feel worse, pump up their blood pressure, and make themselves even angrier.
While serving in the Pennsylvania legislature, Franklin was disturbed by the opposition and animosity of a fellow legislator. So he set out to win him over. He didn’t do it, he wrote, by “paying any servile respect to him”—that is, by doing the other man a favor—but by inducing his target to do a favor for him. (He asked the man to loan him a rare book from his library.)
But when an expert is wrong, the centerpiece of his or her professional identity is threatened. Therefore, dissonance theory predicts that the more self-confident and famous experts are, the less likely they will be to admit mistakes.
Our convictions about who we are carry us through the day, and we are constantly interpreting the things that happen to us through the filter of those core beliefs. When those beliefs are violated, even by a good experience, it causes us discomfort.
It is as if they started off at the top of a pyramid, a millimeter apart, but by the time they have finished justifying their individual actions, they have slid to the bottom and now stand at opposite corners of its base. The one who didn’t cheat considers the other to be totally immoral, and the one who cheated thinks the other is hopelessly puritanical.
It’s the people who almost decide to live in glass houses who throw the first stones.
He redefined cheating as “taking a risk.”
Once he was in the White House, he went along with all of the small ethical compromises that just about all politicians justify in the name of serving their party.
“I know what I have done, and Your Honor knows what I have done. Somewhere between my ambition and my ideals, I lost my ethical compass.” How do you get an honest man to lose his ethical compass? You get him to take one step at a time, and self-justification will do the rest.
Safire’s ability to recognize his own dissonance and resolve it by doing the fair thing is rare. As we will see, his willingness to concede that his own side made a mistake is something that few are prepared to do. Instead, conservatives and liberals alike will bend over backward to reduce dissonance in a way that is favorable to them and their team. The specific tactics vary, but our efforts at self-justification are all designed to serve our need to feel good about what we have done, what we believe, and who we are.
One, people who are open-minded and fair ought to agree with a reasonable opinion, and, two, any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it.
Democrats will endorse an extremely restrictive welfare proposal, one usually associated with Republicans, if they think it has been proposed by the Democratic Party, and Republicans will support a generous welfare policy if they think it comes from the Republican Party.
All of us are as unaware of our blind spots as fish are unaware of the water they swim in, but those who swim in the waters of privilege have a particular motivation to remain oblivious.
The greatest of faults, I should say, is to be conscious of none.
Conflict of interest and politics are synonymous,
Imagine handing over your discovery to the public interest without keeping a few million bucks for yourself.
Throughout the 1980s, the ideological climate shifted from one in which science was valued for its own sake or for the public interest to one in which science was valued for the profits it could generate in the private interest.
The greater danger to the public comes from the self-justifications of well-intentioned scientists and physicians who, because of their need to reduce dissonance, truly believe themselves to be above the influence of their corporate funders. Yet, like a plant turning toward the sun, they turn toward the interests of their sponsors without even being aware that they are doing so.
subtle effects of sponsorship
According to surveys, physicians regard small gifts as being ethically more acceptable than large gifts.
Once you take the gift, no matter how small, the process starts. You will feel the urge to give something back, even if it’s only, at first, your attention, your willingness to listen, your sympathy for the giver. Eventually, you will become more willing to give your prescription, your ruling, your vote.
Pharmaceutical and biotechnology industries are offering consulting fees, contracts, and honoraria to bioethicists, the very people who write about, among other things, the dangers of conflicts of interest between physicians and drug companies.
There’s a clever dissonance-reducing claim for you—“Perfect objectivity is impossible anyway, so I might as well accept that consulting fee.”
“In normal circumstances,” wrote Hitler’s henchman Albert Speer in his memoirs, “people who turn their backs on reality are soon set straight by the mockery and criticism of those around them, which makes them aware they have lost credibility. In the Third Reich there were no such correctives, especially for those who belonged to the upper stratum. On the contrary, every self-deception was multiplied as in a hall of distorting mirrors, becoming a repeatedly confirmed picture of a fantastical dream world which no longer bore any relationship to the grim outside world. In those mirrors I could see nothing but my own face reproduced many times over.”
Our greatest hope of self-correction lies in making sure we are not operating in a hall of mirrors, in which all we see are distorted reflections of our own desires and convictions. We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off. This is especially important for people in positions of power.
Social psychologist Anthony Greenwald has described the self as being ruled by a “totalitarian ego” that ruthlessly destroys information it doesn’t want to hear and, like all fascist leaders, rewrites history from the standpoint of the victor.
But dissonance theory predicts that we will conveniently forget good arguments made by an opponent, just as we forget foolish arguments made by our own side.
“‘I have done that,’ says my memory. ‘I cannot have done that,’ says my pride, and remains inexorable. Eventually—memory yields.”
Every parent has been an unwilling player in the you-can’t-win game. Require your daughter to take piano lessons, and later she will complain that you wrecked her love of the piano. Let your daughter give up lessons because she doesn’t want to practice, and later she will complain that you should have forced her to keep going—why, now she can’t play the piano at all.
Parent blaming is a popular and convenient form of self-justification because it allows people to live less uncomfortably with their regrets and imperfections.
False memories allow people to forgive themselves and justify their mistakes, but sometimes at a high price: an inability to take responsibility for their lives.
But we must also be careful which memories we select to justify our lives, because we will have to live by them.
we felt foolish and embarrassed that we had sacrificed our scientific skepticism on the altar of outrage.
Naturally, not all scientists are scientific—that is, open-minded and willing to give up their strong convictions or admit that conflicts of interest might taint their research. But even when an individual scientist is not self-correcting, science eventually is.
Yet the inherent privacy of the interaction means that therapists who lack training in science and skepticism have no internal corrections to the self-protecting cognitive biases that afflict us all.
Scientific reasoning is useful to anyone in any job because it makes us face the possibility, even the dire reality, that we were mistaken. It forces us to confront our self-justifications and put them on public display for others to puncture. At its core, therefore, science is a form of arrogance control.
For any theory to be scientific, it must be stated in such a way that it can be shown to be false as well as true.