Kindle Notes & Highlights
Read between November 16 and December 8, 2017
And yet, if fallibility is built into our very name and nature, it is in much the same way the puppet is built into the jack-in-the-box: in theory wholly predictable, in practice always a jarring surprise.
I become right squared, maybe even right factorial, logarithmically right—at any rate, really, extremely right, and really, extremely delighted about it.
Pessimistic Meta-Induction from the History of Science. The gist is this: because even the most seemingly bulletproof scientific theories of times past eventually proved wrong, we must assume that today’s theories will someday prove wrong as well. And what goes for science goes in general—for politics, economics, technology, law, religion, medicine, child-rearing, education. No matter the domain of life, one generation’s verities so often become the next generation’s falsehoods that we might as well have a Pessimistic Meta-Induction from the History of Everything.
Henri Bergson argued against “imprisoning the comic spirit within a definition.” Instead, he wrote, he hoped to provide his readers with “something more flexible than an abstract definition—a practical, intimate acquaintance,
But by definition, there can’t be any particular feeling associated with simply being wrong.
So I should revise myself: it does feel like something to be wrong. It feels like being right.
To err is to wander, and wandering is the way we discover the world; and, lost in thought, it is also the way we discover ourselves. Being right might be gratifying, but in the end it is static, a mere statement. Being wrong is hard and humbling, and sometimes even dangerous, but in the end it is a journey, and a story. Who really wants to stay home and be right when you can don your armor, spring up on your steed and go forth to explore the world?
To fuck up is to find adventure: it is in that spirit that this book is written.
A lady once asked me if I believed in ghosts and apparitions. I answered with truth and simplicity, No, madam, I have seen far too many myself. —SAMUEL TAYLOR COLERIDGE
it belongs to a group of similar neurological problems collectively known as anosognosia, or the denial of disease.
National tragedy is good to memory researchers. In 1986, when the space shuttle Challenger exploded, Neisser saw an opportunity to remedy this gap in the memory literature, and to find out whether his own mistaken Pearl Harbor recollection was an anomaly. He surveyed his students about their memories of the disaster the day after it happened, and then again three years later. The results spelled the end of conventional flashbulb memory theory. Less than 7 percent of the second reports matched the initial ones, 50 percent were wrong in two-thirds of their assertions, and 25 percent were wrong…
The answer is that they are confabulating. To confabulate means, basically, to make stuff up; the most relevant etymological ghost is the word “fable.” The confabulations that arise from brain damage are spontaneous fables.
(In our dreams, too, we are paralyzed but think we can move, and blind but think we can see.)
The experiment is known as the false belief test, or, informally, as the Sally-Ann task, after the most famous of its many variations.
By the age of five, virtually all children can pass the Sally-Ann test with ease. In coming to do so, these children have acquired what developmental psychologists call “representational theory of mind.”
“It simply does not make sense to see myself as both believing that P is true”—where “P” stands for any proposition—“and being convinced that I do so for reasons having nothing to do with P’s being true.”
every one of us confuses our models of the world with the world itself—not occasionally or accidentally but necessarily.
If we assume that people who are wrong are ignorant, or idiotic, or evil—well, small wonder that we prefer not to confront the possibility of error in ourselves.
Descartes defined error not as believing something that isn’t true, but as believing something based on insufficient evidence.
We can know that they are wrong—as Hume’s example turned out to be, when a species of black swan was discovered in Australia after his death—but we can’t know that they are right.
Leonard Susskind, who is a professor of theoretical physics at Stanford University, a member of the National Academy of Sciences, and one of the founders of string theory. All of that makes him about as expert as you can get in the domain of science, yet here he is on one of its fundamental principles: “If I were to flip a coin a million times, I’d be damn sure I wasn’t going to get all heads,” he once wrote. “I’m not a betting man, but I’d be so sure that I’d bet my life or my soul on it…. I’m absolutely certain that the laws of large numbers—probability theory—will work and protect me. All…
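As a back-of-the-envelope check (my arithmetic, not the book's), assuming a fair coin, the chance Susskind is dismissing works out to:

\[
P(\text{all heads in } 10^{6} \text{ flips}) = \left(\tfrac{1}{2}\right)^{10^{6}} = 2^{-1\,000\,000} \approx 10^{-301\,030},
\quad \text{since } 10^{6}\log_{10} 2 \approx 301{,}030.
\]

A probability that small is, for every practical purpose, indistinguishable from zero, which is the force of his “absolutely certain.”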
The philosopher Avishai Margalit put this nicely. “It is not the case that I am caught in a web of beliefs,” he wrote. “…Rather, I am caught in a network of witnesses.”
In the actual experiment, however, there was a hitch: only one of the people in the room was really a subject. The others were working for Asch (“stooges,” in psych-experiment parlance), and, per his instructions, after the first few flashcards, they all began to give the same wrong answer. The consequences for the lone authentic subject were striking. Three-quarters of them gave the wrong answer at least once, and one-quarter gave the wrong answer for half or more of the flashcards. On average, the subjects’ error rate rose from under 1 percent when acting independently to almost 37 percent…
For that, we’re better off taking some not-very-mannerly advice from a not-very-mannerly source: the magician and comedian Penn Jillette, who would have appalled the genteel Post, and who once dismissed most conventional etiquette as “bullshit” on his TV show of that name. In an interview for AskMen.com, Jillette attacked the notion that “shutting up about what you believe is showing tolerance to other people.” On the contrary, he said, “I believe shutting up about what you believe is a way to stay close-minded, a way not to be busted. If you have some crazy thought and keep it in your head…
My favorite example, however, comes from the Talmud, the rabbinical writings that serve as a commentary on the Torah and the basis of Orthodox Judaism. According to these writings, if there is a unanimous guilty verdict in a death penalty case, the defendant must be allowed to go free—a provision intended to ensure that, in matters so serious that someone’s life is on the line, at least one person has prevented groupthink by providing a dissenting opinion.
The people around us prevent us from believing things that are (as Penn Jillette put it) “fucking nuts,” while our own inner voice keeps rising up and breaking the surface tension that could otherwise turn a community into a bubble.
It’s one thing to doubt the existence of Santa Claus, another thing to doubt the accuracy of a news story, and a third thing to doubt the accuracy of a news story you yourself wrote.
consider the following sentence: “armadillos may be lured from a thicket with soft cheese.”
William Hirstein even suggests that, when it comes to those in power, we often feel that “an answer that is possibly (or even probably) wrong is better than none at all.” Translation: we are more alarmed by leaders who waver than by those who screw up.
John Maynard Keynes: “When the facts change, I change my mind. What do you do, sir?”
Our commitment to an idea, he concluded, “is healthiest when it is not without doubt, but in spite of doubt.”
a kind of forcing function—the engineer’s term of art for features of the physical world that alert us to the fact that we are making a mistake.
Once you have missed the first buttonhole you’ll never manage to button up. —JOHANN WOLFGANG VON GOETHE
at least 20 percent of seriously ill people who are told that they are near death actually forget the news within a few days—a form of denial so extreme that it involves not simply rejecting but entirely obliterating unwanted information.
As Sartre wrote, to be self-deceived, “I have to know this truth very precisely in order to hide it from myself the more carefully.”
That’s how we know we are alive: we’re wrong. —PHILIP ROTH, AMERICAN PASTORAL
the condition that led Nietzsche to refer to us, wonderingly, as “hybrids of plants and of ghosts.”
We humans can try to imagine the world as rendered by sonar, just as we can try to imagine flying around in the dark, dining on insects, and spending our days sleeping upside down in the attic. As Nagel noted, however, this exercise “tells me only what it would be like for me to behave as a bat behaves. But that is not the question. I want to know what it is like for a bat to be a bat.”
our failure to understand another being’s inner reality doesn’t make that reality any less real, or any less valuable to that being.
someone once asked the South African writer J. M. Coetzee to name his favorite novel. Coetzee replied that it was Daniel Defoe’s Robinson Crusoe—because, he explained, the story of a man alone on an island is the only story there is.
François de La Rochefoucauld observed, “Everyone complains about their memory; no one complains about their judgment.”
According to the Institute of Medicine, between 690,000 and 748,000 patients are affected by medical errors in the United States every year, and between 44,000 and 98,000 die from them. Even the lowball estimate makes medical mistakes the eighth leading cause of death in the nation—worse than breast cancer, AIDS, and motor vehicle accidents.
For commercial aviation to take the same toll in the United States as medical errors do, a sold-out 747 would have to crash every three days, killing everyone on board.
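A rough sanity check of that comparison (my arithmetic, not the book's; the seat count of roughly 400 for a sold-out 747 is an assumption):

```python
# Rough check of the "747 every three days" comparison above.
# Assumption (not from the book): a sold-out 747 carries roughly 400 passengers.
SEATS = 400
DAYS_PER_YEAR = 365

for deaths_per_year in (44_000, 98_000):  # IOM range for annual deaths from medical error
    crashes_per_year = deaths_per_year / SEATS
    days_between_crashes = DAYS_PER_YEAR / crashes_per_year
    print(f"{deaths_per_year:,} deaths/year -> one crash every {days_between_crashes:.1f} days")
```

Under these assumptions, the lower IOM estimate lines up with the book's "every three days"; the higher estimate would mean a crash roughly every day and a half.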
A company that has achieved Six Sigma experiences just 3.4 such errors per million opportunities to err, a laudably low failure rate (or, framed positively, a 99.9997 percent success rate). To get a sense of what this means, consider that a company that ships 300,000 packages per year with a 99 percent success rate sends 3,000 packages to the wrong place. If that same company achieved Six Sigma, only a single package would go astray.
With Six Sigma, then, the goal isn’t to improve the average per se, but to reduce the deviation from that average.
“define, measure, analyze, improve, control.”
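A minimal sketch (mine, not the book's) of the defect arithmetic behind the Six Sigma highlights above, converting a defects-per-million-opportunities rate into expected misdeliveries for the hypothetical 300,000-package shipper:

```python
# Expected failures at a given defect rate, per the shipping example above.
def expected_defects(opportunities: int, defects_per_million: float) -> float:
    """Expected number of failures given a DPMO (defects per million opportunities) rate."""
    return opportunities * defects_per_million / 1_000_000

PACKAGES = 300_000
print(expected_defects(PACKAGES, 10_000))  # 99% success rate (10,000 DPMO) -> 3000.0 misdeliveries
print(expected_defects(PACKAGES, 3.4))     # Six Sigma (3.4 DPMO)           -> ~1.02 misdeliveries
```

The 3.4-per-million figure is, by convention, what a process achieves when its mean sits six standard deviations from the nearest specification limit (allowing the customary 1.5-sigma drift), which is why the highlight above frames the goal as reducing deviation rather than improving the average.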
but the capacity for negation and refusal comes to us very, very early—typically within the first twelve to eighteen months of life. It is soon followed (or, less often, preceded) by “yes,” and, for a while, that’s the kind of world we live in: a black-and-white, yes-and-no universe. Psychologists call this developmental stage “splitting.” Right around the age of five, though, something interesting happens: we learn the word “maybe.” This first tentative foray marks the beginning of our ability to acknowledge, quantify, and talk about uncertainty. As such, it also marks a major step toward…
John Francis—who, in 1973, took what became a seventeen-year vow of silence.
Listening only in order to contradict, argue, and accuse: that reflex will be painfully familiar to many of us. Choosing not to speak might be an extreme countermeasure, but choosing to listen wouldn’t hurt.
Here is Benjamin Franklin, just before appending his name to the most famous piece of parchment in American history: “I confess there are several parts of this Constitution which I do not at present approve, but I am not sure that I shall never approve them. For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise.” No speech could have been more appropriate for the founding of the United States, a nation established in no small…