Being Wrong: Adventures in the Margin of Error
22%
Evil Assumption—the idea that people who disagree with us are not ignorant of the truth, and not unable to comprehend it, but have willfully turned their backs on it.
24%
don’t care about what is logically valid and theoretically possible. We care about what is probable. We determine what is probable based on our prior experience of the world, which is where evidence comes in:
24%
This strategy of guessing based on past experience is known as inductive reasoning.
24%
inductions are necessarily impossible to substantiate. We can know that they are wrong—as Hume’s example turned out to be, when a species of black swan was discovered in Australia after his death—but we can’t know that they are right. All we know is that they are at least somewhat more likely to be right than the next best guess we could make.
25%
Our options in life are not careful logical reasoning, through which we get things right, versus shoddy inductive reasoning, through which we get things wrong. Our options are inductive reasoning, which probably gets things right, and inductive reasoning, which—because it probably gets things right—sometimes gets things wrong. In other words, and appropriately enough, Descartes was half right. Believing something on the basis of messy, sparse, limited information really is how we err. But it is also how we think. What makes us right is what makes us wrong.
25%
It makes us wrong in ways that are a complete embarrassment to our self-image as clear-eyed, fair-minded, conscientious, reasonable thinkers.
25%
one of these inductive biases: leaping to conclusions.
25%
Actually, as I noted above, leaping to conclusions is what we always do in inductive reasoning, but we generally only call it that when the process fails us—that is, when we leap to wrong conclusions.
26%
paradox of inductive reasoning: although small amounts of evidence are sufficient to make us draw conclusions, they are seldom sufficient to make us revise them.
26%
confirmation bias is the tendency to give more weight to evidence that confirms our beliefs than to evidence that challenges them.
26%
Without some kind of belief system in place, he posited, we wouldn’t even know what kinds of questions to ask, let alone how to make sense of the answers.
26%
science is full of examples of how faith in a theory has led people to the evidence, rather than evidence leading them to the theory.
27%
The final form of confirmation bias I want to introduce is by far the most pervasive—and, partly for that reason, by far the most troubling. On the face of it, though, it seems like the most benign, because it requires no active shenanigans on our part—no refusal to believe the evidence, like Elizabeth O’Donovan, no dismissal of its relevance, like the stubborn Scotsman, no co-opting of it, like George Bush. Instead, this form of confirmation bias is entirely passive: we simply fail to look for any information that could contradict our beliefs.
27%
Confirmation bias is also bolstered by the fact that looking for counterevidence often requires time, energy, learning, liberty, and sufficient social capital to weather the suspicion and derision of defenders of the status quo.
27%
Speer himself shows us how to begin. “I did not query,” he wrote. “I did not speak. I did not investigate. I closed my eyes.” These are sins of omission, sins of passivity; and they suggest, correctly, that if we want to improve our relationship to evidence, we must take a more active role in how we think—must, in a sense, take the reins of our own minds.
27%
Charles Darwin. In his autobiography, he recalled: “I had, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones.”
28%
Like the earlier Bacon, then, the later one saw most errors as stemming from collective social forces rather than individual cognitive ones. This is a recurring theme in the history of wrongness.
28%
whether we are more error-prone when we follow the masses or strike out on our own.
29%
vast majority of our beliefs are really beliefs once removed. Our faith that we are right is faith that someone else is right.
31%
First, our communities expose us to disproportionate support for our own ideas. Second, they shield us from the disagreement of outsiders. Third, they cause us to disregard whatever outside disagreement we do encounter. Finally, they quash the development of disagreement from within.
32%
What really gets you into trouble with a community isn’t holding a belief it scorns; it is abandoning a belief it cherishes.
32%
While insular groups are relatively immune to outside opinion, they are highly dependent on reinforcement of their belief system from within. As a result, internal dissent, unlike outside opposition, can be deeply destabilizing.
32%
affirming and later rejecting a belief jeopardizes the whole paradigm of truth. As I argue throughout this book, our mistakes disturb us in part because they call into question not just our confidence in a single belief, but our confidence in the entire act of believing.
33%
We would all like to believe that, had we lived in France during World War II, we would have been among those heroic souls fighting the Nazi occupation and helping ferry the persecuted to safety. The reality, though, is that only about 2 percent of French citizens actively participated in the Resistance.
33%
(The very word “zealot” comes from a Greek root meaning to be jealous of the truth—to guard it as your own.)
33%
Zealotry demands a complete rejection of the possibility of error.
34%
remarkably, despite our generally supple, imaginative, extrapolation-happy minds, we cannot transpose this scene. We cannot imagine, or do not care, that our own certainty, when seen from the outside, must look just as unbecoming and ill-grounded as the certainty we abhor in others.
34%
If imagination is what enables us to conceive of and enjoy stories other than our own, and if empathy is the act of taking other people’s stories seriously, certainty deadens or destroys both qualities.
34%
Doubt, it seems, is a skill—and one that, as we saw earlier, needs to be learned and honed. Credulity, by contrast, appears to be something very like an instinct.
36%
it feels safe and pleasurable to be steadfast in our convictions. But we also find other people’s certainty deeply attractive.
37%
Our commitment to an idea, he concluded, “is healthiest when it is not without doubt, but in spite of doubt.”
37%
the moment when the feeling of being right seroconverts to the feeling of being wrong. Psychologically as well as structurally, this moment forms the central experience of error. It is here that some part of our past self gives up the ghost, and some part of the person we will become begins to stir. As that suggests, this moment is crucial to our moral and intellectual development.
38%
For the most part, our beliefs change either too quickly or too slowly to isolate the actual encounter with error.
38%
In updating the past to accord with the present, we eliminate the necessity (and the possibility) of confronting our mistakes. If we think we’ve always believed what we believe right now, there is no friction, no change, no error, and above all no uncomfortably different past self to account for.
38%
If gradual belief change protects us from the experience of error by attenuating it virtually out of existence, sudden belief change does the opposite: it condenses that experience almost to the vanishing point.
38%
We vault over the actual experience of wrongness so quickly that the only evidence that we erred is that something inside us has changed.
38%
scientific theories very seldom collapse under the weight of their own inadequacy. They topple only when a new and seemingly better belief turns up to replace them.
39%
perhaps the chief thing we learn from being wrong is how much growing up we still have to do.
40%
Sometimes people succeed in showing us our errors and sometimes they fail, but only rarely does that success or failure hinge on the accuracy of their information. Instead, as we saw in our discussion of community, it has almost everything to do with the interpersonal forces at work: trust or mistrust, attraction or repulsion, identification or alienation. It’s no surprise, then, that other people’s input is often insufficient to make us recognize our mistakes.
40%
The fact is, with the exception of our own minds, no power on earth has the consistent and absolute ability to convince us that we are wrong. However much we might be prompted by cues from other people or our environment, the choice to face up to error is ultimately ours alone.
43%
almost no matter what we were wrong about, we can find countless different ways to take the measure of our mistakes.
43%
we all incline toward conservatism when it comes to determining the size of our mistakes.
44%
time-frame defense is that, however wrong I seem, I am actually more right than those who currently look it: I am a visionary, able to see the lay of the land from a more distant and loftier (i.e., Godlike) perspective.
44%
near-miss defense. Here the claim is not that our prediction will come to pass eventually, but rather that it almost came to pass.
44%
Boiled down to its indisputably true but patently absurd essence, the argument of the near-miss defense is that if I hadn’t been wrong, I would have been right.
44%
out-of-left-field defense. The claim here is that I was on track to being absolutely right when—bang!—some bizarre and unforeseeable event derailed the natural course of things and ren...
44%
most popular Wrong Buts of all time: blaming other people.
44%
(“I was wrong, but it’s your fault.”)
44%
“better safe than sorry” defense.