Kindle Notes & Highlights
Read between June 18–July 2, 2018
The model of the world—the belief—is vital, but awareness of the belief is dispensable. In fact, lack of awareness is the norm. From the anticipated behavior of inanimate objects to the presumed identity of our parents to whether we can see a mountain chain or see at all, the vast majority of our mental models are implicit: entirely unfelt, yet essential to how we make sense of ourselves and the world.
Theorizing, like fornicating, is of timeless use to our species.
we’re so emotionally invested in our beliefs that we are unable or unwilling to recognize them as anything but the inviolable truth. (The very word “believe” comes from an Old English verb meaning “to hold dear,” which suggests, correctly, that we have a habit of falling in love with our beliefs once we’ve formed them.)
As the Princeton psychologist Emily Pronin and her colleagues observed in a study of the bias blind spot, “we are not particularly comforted when others assure us that they have looked into their own hearts and minds and concluded that they have been fair and objective.”
So we look into our hearts and see objectivity; we look into our minds and see rationality; we look at our beliefs and see reality. This is the essence of the ’Cuz It’s True Constraint: every one of us confuses our models of the world with the world itself—not occasionally or accidentally but necessarily. This is a powerful phenomenon, and it sets in motion a cascade of corollaries that determines how we deal with challenges to our belief systems—not, alas, for the better.
The first such corollary is the Ignorance Assumption. Since we think our own beliefs are based on the facts, we conclude that people who disagree with us just haven’t been exposed to the right information, and that such exposure would inevitably bring them over to our team. This assumption is extraordinarily widespread. To cite only the most obvious examples, all religious evangelism and a good deal of political activism (especially grass-roots activism) is premised on the conviction that you can change people’s beliefs by educating them on the issues.
When other people reject our beliefs, we think they lack good information. When we reject their beliefs, we think we possess good judgment.
Of these three assumptions—the Ignorance Assumption, the Idiocy Assumption, and the Evil Assumption—the last is the most disturbing. But the first is the most decisive. We assume that other people are ignorant because we assume that we are not; we think we know the facts, and (as the ’Cuz It’s True Constraint mandates) we think those facts determine our beliefs. Put differently, we think the evidence is on our side. It is almost impossible to overstate the centrality of that conviction to everything this book is about, which is why we are going to turn next to the topic of evidence. Our faith
[…]
Descartes defined error not as believing something that isn’t true, but as believing something based on insufficient evidence.
This strategy of guessing based on past experience is known as inductive reasoning. As we’ve seen, inductive reasoning makes us vastly better than computers at solving problems like the ones in this quiz.
our beliefs are not necessarily true. Instead, they are probabilistically true. This point was made (and made famous) by the philosopher David Hume, who was arguably the first thinker to fully grasp both the import and the limitation of inductive reasoning. To borrow his much-cited example: How can I be sure that all swans are white if I myself have seen only a tiny fraction of all the swans that have ever existed? If the world were always uniform, consistent, and predictable, I could trust that I am right about this induction (and about all inductions). Unfortunately, as Hume noted, nothing
[…]
inductions are necessarily impossible to substantiate. We can know that they are wrong—as Hume’s example turned out to be, when a species of black swan was discovered in Australia after his death—but we can’t know that they are right. All we know is that they are at least somewhat more likely to be right than the next best guess we could make.
Language, categorization, causality, psychology: without expertise in these domains, we would be sunk. And without inductive reasoning—the capacity to reach very big conclusions based on very little data—we would never gain that expertise. However slapdash it might initially seem, this best-guess style of reasoning is critical to human intelligence. In fact, these days, inductive reasoning is the leading candidate for actually being human intelligence.
But score a point for Descartes: inductive reasoning also makes us fundamentally, unavoidably fallible.
This is the lesson we learned with optical illusions, and it is the fundamental lesson of inductive reasoning as well: our mistakes are part and parcel of our brilliance, not the regrettable consequences of a separate and deplorable process. Our options in life are not careful logical reasoning, through which we get things right, versus shoddy inductive reasoning, through which we get things wrong. Our options are inductive reasoning, which probably gets things right, and inductive reasoning, which—because it probably gets things right—sometimes gets things wrong. In other words, and
[…]
theories represent the beginning of science as much as its endpoint. Theories tell us what questions to ask (“Why is Uranus’s orbit out of whack?”), what kind of answers to look for (“Something really big must be exerting a gravitational pull on it.”), and where to find them (“According to Newtonian calculations, that big thing must be over there.”). They also tell us what not to look for and what questions not to ask, which is why those astronomers didn’t bother looking for a giant intergalactic battleship warping Uranus’s orbit instead. These are invaluable directives, prerequisite to doing
[…]
As all this suggests, our relationship to evidence is seldom purely a cognitive one. Vilifying menstruating women, bolstering anti-Muslim stereotypes, murdering innocent citizens of Salem: plainly, evidence is almost invariably a political, social, and moral issue as well.
By indirection, Speer himself shows us how to begin. “I did not query,” he wrote. “I did not speak. I did not investigate. I closed my eyes.” These are sins of omission, sins of passivity; and they suggest, correctly, that if we want to improve our relationship to evidence, we must take a more active role in how we think—must, in a sense, take the reins of our own minds. To do this, we must query and speak and investigate and open our eyes. Specifically, and crucially, we must learn to actively combat our inductive biases: to deliberately seek out evidence that challenges our beliefs, and to take
[…]
To Bacon’s mind, all error could be chalked up to just four problems, which he called (rather charmingly, to English speakers) offendicula: impediments or obstacles to truth. One of those obstacles was a kind of thirteenth-century version of Modern Jackass: the tendency to cover up one’s own ignorance with the pretense of knowledge. Another was the persuasive power of authority. A third was blind adherence to custom, and the last was the influence of popular opinion.
For Francis Bacon, too, there were four major sources of human error, which he called the four idols. The idol of the Tribe roughly corresponds to the terrain I covered in the last three chapters: universal, species-wide cognitive habits that can lead us into error. The idol of the Cave refers to chauvinism—the tendency to distrust or dismiss all peoples and beliefs foreign to our own clan. The idol of the Marketplace is analogous to what the earlier Bacon called the influence of public opinion, and includes the potentially misleading effects of language and rhetoric. The last idol, that of
[…]
Thinking for oneself is, beyond a doubt, a laudable goal. But there are three problems with the idea that it is a good way to ward off error. The first is that the glorification of independent thought can easily become a refuge for holders of utterly oddball beliefs. You can dismiss any quantity of informed and intelligent adversaries if you choose to regard them as victims of a collective, crowd-driven madness, while casting yourself as the lone voice of truth. The second problem is that (as we have seen), our own direct observations and experiences are not necessarily more trustworthy than
[…]
The last and most significant problem with the idea that we should always think for ourselves is that, bluntly put, we can’t. Every one of us is profoundly dependent on other people’s minds—so profoundly that if we took seriously the charge to think for ourselves, we would have to relinquish our faith in the vast majority of the things we think we know.
The vast majority of our beliefs are really beliefs once removed. Our faith that we are right is faith that someone else is right.
This reliance on other people’s knowledge—those around us as well as those who came before us—is, on balance, a very good thing. Life is short, and most of us don’t want to spend any more of it than absolutely necessary trying to independently verify the facts about turnips. Relying on other people to do that work buys us all a lot of time. It also buys us, in essence, many billions of prosthetic brains. Thanks to other people’s knowledge, I know a bit about what Thomas Jefferson was like in person, how it feels to climb Mount Everest, and what kind of creatures live in the depths of the
[…]
The philosopher Avishai Margalit put this nicely. “It is not the case that I am caught in a web of beliefs,” he wrote. “…Rather, I am caught in a network of witnesses.” Our relationships to these “witnesses”—the people and institutions that attest to the truth of various beliefs—predate and determine our reaction to whatever information they supply. As Margalit said, “my belief in [one of these witnesses] is prior to my belief that (what she says is true).”
As countless commentators have observed, this lends to our beliefs an element of the arbitrary. Montaigne, for instance, remarked that people “are swept [into a belief]—either by the custom of their country or by their parental upbringing, or by chance—as by a tempest, without judgment or choice, indeed most often before the age of discretion.”* This claim is at once obvious and irksome, not least because it is directly at odds with the ’Cuz It’s True Constraint. If we think we believe our beliefs based on the facts, we aren’t likely to appreciate the alternative theory that we actually
[…]
My favorite example, however, comes from the Talmud, the rabbinical writings that serve as a commentary on the Torah and the basis of Orthodox Judaism. According to these writings, if there is a unanimous guilty verdict in a death penalty case, the defendant must be allowed to go free—a provision intended to ensure that, in matters so serious that someone’s life is on the line, at least one person has prevented groupthink by providing a dissenting opinion.
If a single person breaking ranks on a single belief can threaten the cohesion of an entire community, it can also—and perhaps even more alarmingly—threaten the entire nature of believing. This is the point I gestured toward at the beginning of this chapter: if our beliefs can change when we cross a border (or meet a Catholic aid worker), then truth comes to seem like nothing more than a local perspective. That’s disturbing, because the whole point of truth is that it is supposed to be universal. Shahnawaz Farooqui, a Muslim journalist and commentator who supported the death penalty for Abdul
[…]
This is error-blindness as a moral problem: we can’t always know, today, which of our current beliefs will someday come to seem ethically indefensible—to us, or to history.
Properly speaking, there is no certainty; there are only people who are certain.
—Charles Renouvier, Essais de critique générale
What zealots have in common, then, is the absolute conviction that they are right. In fact, of all the symbolic ones and zeros that extremists use to write their ideological binary codes—us/them, same/different, good/evil—the fundamental one is right/wrong. Zealotry demands a complete rejection of the possibility of error. The conviction that we cannot possibly be wrong: this is certainty.
The certainty that we sometimes see channeled toward malevolent ends is not, in its essence, different from the flare of righteous anger that causes each of us to think, mid-argument, that it is only the other person who is irrational, unyielding, and wrong.
certainty suggests something bigger and more forceful than knowledge. The great American satirist Ambrose Bierce defined it as “being mistaken at the top of one’s voice,”
The feeling of knowing, then, is less a synonym for certainty than a precondition for it. And we have encountered other preconditions as well. There are our sensory perceptions, so immediate and convincing that they seem beyond dispute. There is the logical necessity, captured by the ’Cuz It’s True Constraint, of thinking that our beliefs are grounded in the facts. There are the biases we bring to bear when we assess the evidence for and against those beliefs. And there is the fact that our convictions and our communities are mutually reinforcing, so that we can’t question our beliefs without
[…]
All of these factors conduce to the condition of certainty—even as they should caution us against it. We have seen, after all, that knowledge is a bankrupt category and that the feeling of knowing is not a reliable indicator of accuracy. We have seen that our senses can fail us, our minds mislead us, our communities blind us. And we have seen, too, that certainty can be a moral catastrophe waiting to happen. Moreover, we often recoil from the certainty of others even when they aren’t using it to excuse injustice or violence. The certainty of those with whom we disagree—whether the disagreement
[…]
By contrast, we experience our own certainty as simply a side effect of our rightness, justifiable because our cause is just. And, remarkably, despite our generally supple, imaginative, extrapolation-happy minds, we cannot transpose this scene. We cannot imagine, or do not care, that our own certainty, when seen from the outside, must look just as unbecoming and ill-grounded as the certainty we abhor in others.
If imagination is what enables us to conceive of and enjoy stories other than our own, and if empathy is the act of taking other people’s stories seriously, certainty deadens or destroys both qualities.
When we are caught up in our own convictions, other people’s stories—which is to say, other people—cease to matter to us. This happens on the scale of history (a specific person’s story is always irrelevant to zealots, unless it serves the ends of the group), but it also happens to each of us as individuals. If you doubt it, listen to yourself the next time you argue with a family member. Leaving behind our more thoughtful and generous selves, we become smug, or patronizing, or scornful, or downright bellicose. And that’s when we are fighting with people we love.
you will be hard-pressed to find a skeptical mollusk.
Doubt, it seems, is a skill—and one that, as we saw earlier, needs to be learned and honed. Credulity, by contrast, appears to be something very like an instinct.
As with our own certainty, so too with theirs: we mistake it for a sign that they are right.
the right painted him as a tergiversator.
riposte of John Maynard Keynes: “When the facts change, I change my mind. What do you do, sir?”
The psychologist Rollo May once wrote about the “seeming contradiction that we must be fully committed, but we must also be aware at the same time that we might possibly be wrong.”
Our commitment to an idea, he concluded, “is healthiest when it is not without doubt, but in spite of doubt.”
Remember the Warner Brothers coyote, the one who runs off the cliff but doesn’t fall until he looks down? Certainty is our way of not looking down.
A thousand years before her birth, al-Ghazali, the Persian philosopher, meditated on precisely this problem. Of the irreversibility of breaking with past beliefs, he wrote, “There can be no desire to return to servile conformism once it has been abandoned, since a prerequisite for being a servile conformist is that one does not know [oneself] to be such.” But when someone recognizes his former beliefs as false, al-Ghazali continued, “the glass of his servile conformism is shattered—an irreparable fragmentation and a mess which cannot be mended by patching and piecing together.” Instead, he
[…]
The problem is that, as with the feeling of rightness, our investment in a belief (or conversely, our indifference to it) has no necessary relationship to its truth. No amount of sunk costs can make an erroneous belief accurate, just as fixing the flat on a junky car can’t make it un-junky. But our sunk costs do have a keen relationship to our loyalty. The more we spend on a belief, the harder it is to extricate ourselves from it. As Anita put it, “there’s a continuum of things you can be wrong about, and some of them are bearable, and some of them are not. I can’t really accept the
[…]
By contrast, the wisdom we perceive in the elderly often stems from their hard-earned knowledge that no one knows everything. In the long haul, they recognize, all of us screw up, misunderstand ideas, misjudge situations, underestimate other people, overestimate ourselves—and all of this over and over again. In this respect, their sagacity is a form of humility, one that enables a less rigid relationship to the world.