Kindle Notes & Highlights
Evil Assumption—the idea that people who disagree with us are not ignorant of the truth, and not unable to comprehend it, but have willfully turned their backs on it.
don’t care about what is logically valid and theoretically possible. We care about what is probable. We determine what is probable based on our prior experience of the world, which is where evidence comes in:
This strategy of guessing based on past experience is known as inductive reasoning.
inductions are necessarily impossible to substantiate. We can know that they are wrong—as Hume’s example turned out to be, when a species of black swan was discovered in Australia after his death—but we can’t know that they are right. All we know is that they are at least somewhat more likely to be right than the next best guess we could make.
Our options in life are not careful logical reasoning, through which we get things right, versus shoddy inductive reasoning, through which we get things wrong. Our options are inductive reasoning, which probably gets things right, and inductive reasoning, which—because it probably gets things right—sometimes gets things wrong. In other words, and appropriately enough, Descartes was half right. Believing something on the basis of messy, sparse, limited information really is how we err. But it is also how we think. What makes us right is what makes us wrong.
It makes us wrong in ways that are a complete embarrassment to our self-image as clear-eyed, fair-minded, conscientious, reasonable thinkers.
one of these inductive biases: leaping to conclusions.
Actually, as I noted above, leaping to conclusions is what we always do in inductive reasoning, but we generally only call it that when the process fails us—that is, when we leap to wrong conclusions.
paradox of inductive reasoning: although small amounts of evidence are sufficient to make us draw conclusions, they are seldom sufficient to make us revise them.
confirmation bias is the tendency to give more weight to evidence that confirms our beliefs than to evidence that challenges them.
Without some kind of belief system in place, he posited, we wouldn’t even know what kinds of questions to ask, let alone how to make sense of the answers.
science is full of examples of how faith in a theory has led people to the evidence, rather than evidence leading them to the theory.
The final form of confirmation bias I want to introduce is by far the most pervasive—and, partly for that reason, by far the most troubling. On the face of it, though, it seems like the most benign, because it requires no active shenanigans on our part—no refusal to believe the evidence, like Elizabeth O’Donovan, no dismissal of its relevance, like the stubborn Scotsman, no co-opting of it, like George Bush. Instead, this form of confirmation bias is entirely passive: we simply fail to look for any information that could contradict our beliefs.
Confirmation bias is also bolstered by the fact that looking for counterevidence often requires time, energy, learning, liberty, and sufficient social capital to weather the suspicion and derision of defenders of the status quo.
Speer himself shows us how to begin. "I did not query," he wrote. "I did not speak. I did not investigate. I closed my eyes." These are sins of omission, sins of passivity; and they suggest, correctly, that if we want to improve our relationship to evidence, we must take a more active role in how we think—must, in a sense, take the reins of our own minds.
Charles Darwin. In his autobiography, he recalled that "I had, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones."
Like the earlier Bacon, then, the later one saw most errors as stemming from collective social forces rather than individual cognitive ones. This is a recurring theme in the history of wrongness.
whether we are more error-prone when we follow the masses or strike out on our own.
vast majority of our beliefs are really beliefs once removed. Our faith that we are right is faith that someone else is right.
First, our communities expose us to disproportionate support for our own ideas. Second, they shield us from the disagreement of outsiders. Third, they cause us to disregard whatever outside disagreement we do encounter. Finally, they quash the development of disagreement from within.
What really gets you into trouble with a community isn’t holding a belief it scorns; it is abandoning a belief it cherishes.
While insular groups are relatively immune to outside opinion, they are highly dependent on reinforcement of their belief system from within. As a result, internal dissent, unlike outside opposition, can be deeply destabilizing.
affirming and later rejecting a belief jeopardizes the whole paradigm of truth. As I argue throughout this book, our mistakes disturb us in part because they call into question not just our confidence in a single belief, but our confidence in the entire act of believing.
We would all like to believe that, had we lived in France during World War II, we would have been among those heroic souls fighting the Nazi occupation and helping ferry the persecuted to safety. The reality, though, is that only about 2 percent of French citizens actively participated in the Resistance.
(The very word “zealot” comes from a Greek root meaning to be jealous of the truth—to guard it as your own.)
Zealotry demands a complete rejection of the possibility of error.
remarkably, despite our generally supple, imaginative, extrapolation-happy minds, we cannot transpose this scene. We cannot imagine, or do not care, that our own certainty, when seen from the outside, must look just as unbecoming and ill-grounded as the certainty we abhor in others.
If imagination is what enables us to conceive of and enjoy stories other than our own, and if empathy is the act of taking other people’s stories seriously, certainty deadens or destroys both qualities.
Doubt, it seems, is a skill—and one that, as we saw earlier, needs to be learned and honed. Credulity, by contrast, appears to be something very like an instinct.
it feels safe and pleasurable to be steadfast in our convictions. But we also find other people’s certainty deeply attractive.
Our commitment to an idea, he concluded, “is healthiest when it is not without doubt, but in spite of doubt.”
the moment when the feeling of being right seroconverts to the feeling of being wrong. Psychologically as well as structurally, this moment forms the central experience of error. It is here that some part of our past self gives up the ghost, and some part of the person we will become begins to stir. As that suggests, this moment is crucial to our moral and intellectual development.
For the most part, our beliefs change either too quickly or too slowly to isolate the actual encounter with error.
In updating the past to accord with the present, we eliminate the necessity (and the possibility) of confronting our mistakes. If we think we’ve always believed what we believe right now, there is no friction, no change, no error, and above all no uncomfortably different past self to account for.
If gradual belief change protects us from the experience of error by attenuating it virtually out of existence, sudden belief change does the opposite: it condenses that experience almost to the vanishing point.
We vault over the actual experience of wrongness so quickly that the only evidence that we erred is that something inside us has changed.
scientific theories very seldom collapse under the weight of their own inadequacy. They topple only when a new and seemingly better belief turns up to replace them.
perhaps the chief thing we learn from being wrong is how much growing up we still have to do.
Sometimes people succeed in showing us our errors and sometimes they fail, but only rarely does that success or failure hinge on the accuracy of their information. Instead, as we saw in our discussion of community, it has almost everything to do with the interpersonal forces at work: trust or mistrust, attraction or repulsion, identification or alienation. It’s no surprise, then, that other people’s input is often insufficient to make us recognize our mistakes.
The fact is, with the exception of our own minds, no power on earth has the consistent and absolute ability to convince us that we are wrong. However much we might be prompted by cues from other people or our environment, the choice to face up to error is ultimately ours alone.
almost no matter what we were wrong about, we can find countless different ways to take the measure of our mistakes.
we all incline toward conservatism when it comes to determining the size of our mistakes.
time-frame defense is that, however wrong I seem, I am actually more right than those who currently look it: I am a visionary, able to see the lay of the land from a more distant and loftier (i.e., Godlike) perspective.
near-miss defense. Here the claim is not that our prediction will come to pass eventually, but rather that it almost came to pass.
Boiled down to its indisputably true but patently absurd essence, the argument of the near-miss defense is that if I hadn’t been wrong, I would have been right.
out-of-left-field defense. The claim here is that I was on track to being absolutely right when—bang!—some bizarre and unforeseeable event derailed the natural course of things and ren...
most popular Wrong Buts of all time: blaming other people.
(“I was wrong, but it’s your fault.”)
“better safe than sorry” defense.