But by definition, there can’t be any particular feeling associated with simply being wrong. Indeed, the whole reason it’s possible to be wrong is that, while it is happening, you are oblivious to it. When you are simply going about your business in a state you will later decide was delusional, you have no idea of it whatsoever. You are like the coyote in the Road Runner cartoons, after he has gone off the cliff but before he has looked down.
Think about the telling fact that error literally doesn’t exist in the first-person present tense: the sentence “I am wrong” describes a logical impossibility. As soon as we know that we are wrong, we aren’t wrong anymore, since to recognize a belief as false is to stop believing it. Thus we can only say “I was wrong.” Call it the Heisenberg Uncertainty Principle of Error: we can be wrong, or we can know it, but we can’t do both at the same time.
This is practical and efficient pedagogy, but it shores up our tacit assumption that current belief is identical with true belief, and it reinforces our generalized sense of rightness.
We already saw that “seeing the world as it is not” is pretty much the definition of erring—but it is also the essence of imagination, invention, and hope.
To err is to wander, and wandering is the way we discover the world; and, lost in thought, it is also the way we discover ourselves. Being right might be gratifying, but in the end it is static, a mere statement. Being wrong is hard and humbling, and sometimes even dangerous, but in the end it is a journey, and a story.
being wrong is often a side effect of a system that is functioning exactly right.
For the most part, though, the feeling of blankness is a lousy guide to ignorance—because, thanks to our aptitude for generating stories, stuff is almost always coming to mind. As a result, to know that we don’t know, we can’t just passively wait around to see if our mind comes up empty.
Some of us have voluble and inventive inner writers, some of us have meticulous inner fact-checkers, and a lucky few have both. Most of us, however, are noticeably better at generating theories than at registering our own ignorance. Hirstein says that once he began studying confabulation, he started seeing sub-clinical versions of it everywhere he looked, in the form of neurologically normal people “who seem unable to say the words, ‘I don’t know,’ and will quickly produce some sort of plausible-sounding response to whatever they are asked.”
In other words, if we want to discredit a belief, we will argue that it is advantageous, whereas if we want to champion it, we will argue that it is true. That’s why we downplay or dismiss the self-serving aspects of our own convictions, even as we are quick to detect them in other people’s beliefs. Psychologists refer to this asymmetry as “the bias blind spot.”
the Ignorance Assumption. Since we think our own beliefs are based on the facts, we conclude that people who disagree with us just haven’t been exposed to the right information, and that such exposure would inevitably bring them over to our team.
ignorance isn’t necessarily a vacuum waiting to be filled; just as often, it is a wall, actively maintained.
When other people reject our beliefs, we think they lack good information. When we reject their beliefs, we think we possess good judgment.
When the Ignorance Assumption fails us—when people stubbornly persist in disagreeing with us even after we’ve tried to enlighten them—we move on to the Idiocy Assumption.
This is the Evil Assumption—the idea that people who disagree with us are not ignorant of the truth, and not unable to comprehend it, but have willfully turned their backs on it.
Think about the accusation that people who disagree with us “don’t live in the real world.” What we really mean is that they don’t live in our model of the world; they don’t share our vision of how things are. By failing to see the world as we do, they actually are undermining its reality and threatening its destruction—at least, unto us. But, of course, we are doing the same to them. Implicitly or explicitly, we are denying that they possess the same intellectual and moral faculties that we do—and denying, too, the significance and value of their life experiences, from which, inevitably, many…
You can urge people not to believe anything based on meager evidence until you are blue in the face, but you will never succeed—because, as it turns out, believing things based on meager evidence is what people do. I don’t mean that we do this occasionally: only when we are thinking sloppily, say, or only if we don’t know any better, or only en route to screwing up. I mean that believing things based on paltry evidence is the engine that drives the entire miraculous machinery of human cognition.
induction is central to how we learn about people, including ourselves. Indeed, much of psychoanalytic theory is based on the belief that our earliest interactions with a tiny number of people permanently shape our theories about who we are, what other people are like, and what kind of treatment we can expect to receive throughout our lives.
But this is the paradox of inductive reasoning: although small amounts of evidence are sufficient to make us draw conclusions, they are seldom sufficient to make us revise them.
Far from making us reevaluate our beliefs, external opposition—especially opposition that we perceive as threatening or insulting—tends to make us dig our heels in even more.
This leads to something of a damned-if-you-do, damned-if-you-don’t predicament—because, as it turns out, not being exposed to external opposition can also make us grow more adamant about our beliefs.
We cannot imagine, or do not care, that our own certainty, when seen from the outside, must look just as unbecoming and ill-grounded as the certainty we abhor in others.
This is one of the most defining and dangerous characteristics of certainty: it is toxic to a shift in perspective. If imagination is what enables us to conceive of and enjoy stories other than our own, and if empathy is the act of taking other people’s stories seriously, certainty deadens or destroys both qualities. When we are caught up in our own convictions, other people’s stories—which is to say, other people—cease to matter to us.
As with our own certainty, so too with theirs: we mistake it for a sign that they are right.
In a sense, the infamous polarization of the 2004 electorate could be boiled down to this: voters who were disquieted by changes of mind versus voters who were disquieted by impermeable conviction.
an enduring and troubling feature of our political culture. In politics, staying the course is admired (and changing direction is denigrated) intrinsically—that is, without regard to where the course itself might lead.
As the late renowned military historian Barbara Tuchman observed, “to recognize error, to cut losses, to alter course, is the most repugnant option in government.”
No matter how merited doubt and admissions of error might be, we loathe them in our political leaders, and associate them—inaccurately, but indissolubly—with weakness.
“What scientists never do when confronted by even severe and prolonged anomalies,” Kuhn wrote, “…[is] renounce the paradigm that led them into crisis.” Instead, he concluded, “A scientific theory is declared invalid only if an alternate candidate is available to take its place.” That is, scientific theories very seldom collapse under the weight of their own inadequacy. They topple only when a new and seemingly better belief turns up to replace them.
“Our capacity to tolerate error,” Gadd said, “depends on our capacity to tolerate emotion.”
What happens in Plato’s Symposium is this: a bunch of guys go to a party, get drunk, and sit around bullshitting about love. (Don’t be fooled by today’s healthcare symposiums and technology symposiums and workplace-safety symposiums. In ancient Greek, the word specifically referred to a drinking bash.)
And we often treat our lovers as we treat our theories, rejecting one that isn’t quite working only when we have a new one to replace it. That’s part of why so many people have affairs by way of ending their relationships, or rocket into a “rebound relationship” after a difficult breakup.
This is the thing about intimate relationships: we sign up to share our lives with someone else, and sooner or later we realize that we are also living with another person’s reality. But we don’t particularly want to live with our partner’s reality. We just want him or her to second our own.
And yet, ironically, it’s mainly relinquishing this attachment to rightness that is difficult and uncomfortable—not, generally speaking, what happens afterward. This provides a crucial clue about the origins of our desire to be right. It isn’t that we care so fiercely about the substance of our claims. It is that we care about feeling affirmed, respected, and loved.
The moral here is obvious: we can learn to live with disagreement and error as long as we feel esteemed and loved.
To be judgmental, we must feel sure that we know right from wrong, and that we ourselves would never confuse the two. But the experience of erring shows us otherwise.
Here is Benjamin Franklin, just before appending his name to the most famous piece of parchment in American history: “I confess there are several parts of this Constitution which I do not at present approve, but I am not sure that I shall never approve them. For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise.”
this kind of wrongness gets us started and keeps us going. Take away our willingness to overestimate ourselves, and we wouldn’t dare to undertake half the things we do.