Kindle Notes & Highlights
No, the bigger problem is that we’re proud of not knowing things. Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue.
Principled, informed arguments are a sign of intellectual health and vitality in a democracy.
The foundational knowledge of the average American is now so low that it has crashed through the floor of “uninformed,” passed “misinformed” on the way down, and is now plummeting to “aggressively wrong.” People don’t just believe dumb things; they actively resist further learning rather than let go of those beliefs. I was not alive in the Middle Ages, so I cannot say it is unprecedented, but within my living memory I’ve never seen anything like it.
In later years, however, I started hearing the same stories from doctors. And from lawyers. And from teachers. And, as it turns out, from many other professionals whose advice is usually not contradicted easily. These stories astonished me: they were not about patients or clients asking sensible questions, but about those same patients and clients actively telling professionals why their advice was wrong. In every case, the idea that the expert knew what he or she was doing was dismissed almost out of hand.
There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that “my ignorance is just as good as your knowledge.”
Thabo Mbeki, then the president of South Africa, seized on the idea that AIDS was caused not by a virus but by other factors, such as malnourishment and poor health, and so he rejected offers of drugs and other forms of assistance to combat HIV infection in South Africa. By the mid-2000s, his government relented, but not before Mbeki’s fixation on AIDS denialism ended up costing, by the estimates of doctors at the Harvard School of Public Health, well over three hundred thousand lives and the births of some thirty-five thousand HIV-positive children whose infections could have been avoided.[1]
Actually, this is an understatement: the public not only expressed strong views; respondents showed enthusiasm for military intervention in Ukraine in direct proportion to their lack of knowledge about Ukraine.
These are dangerous times. Never have so many people had so much access to so much knowledge and yet have been so resistant to learning anything.
Not only do increasing numbers of laypeople lack basic knowledge, they reject fundamental rules of evidence and refuse to learn how to make a logical argument. In doing so, they risk throwing away centuries of accumulated knowledge and undermining the practices and habits that allow us to develop new knowledge.
Stitch this cut in my leg, but don’t lecture me about my diet. (More than two-thirds of Americans are overweight.) Help me beat this tax problem, but don’t remind me that I should have a will. (Roughly half of Americans with children haven’t bothered to write one.) Keep my country safe, but don’t confuse me with the costs and calculations of national security. (Most US citizens do not have even a remote idea of how much the United States spends on its armed forces.)
The death of expertise is not just a rejection of existing knowledge. It is fundamentally a rejection of science and dispassionate rationality, which are the foundations of modern civilization.
Americans now believe that having equal rights in a political system also means that each person’s opinion about anything must be accepted as equal to anyone else’s.
And yet the result has not been a greater respect for knowledge, but the growth of an irrational conviction among Americans that everyone is as smart as everyone else. This is the opposite of education, which should aim to make people, no matter how smart or accomplished they are, learners for the rest of their lives. Rather, we now live in a society where the acquisition of even a little learning is the endpoint, rather than the beginning, of education. And this is a dangerous thing.
When students become valued clients instead of learners, they gain a great deal of self-esteem, but precious little knowledge; worse, they do not develop the habits of critical thinking that would allow them to continue to learn and to evaluate the kinds of complex issues on which they will have to deliberate and vote as citizens.
Citizens, to be sure, reinforce this reticence by arguing rather than questioning—an important difference—but that does not relieve experts of their duty to serve society and to think of their fellow citizens as their clients rather than as annoyances.
Even in the ancient world, democracies were known for their fascination with change and progress. Thucydides, for example, described the democratic Athenians of the fifth century BC as a restless people “addicted to innovation,” and centuries later, St. Paul found that the Athenians “spent their time doing nothing but talking about and listening to the latest ideas.”
Alexis de Tocqueville, the French observer, noted in 1835 that the denizens of the new United States were not exactly enamored of experts or their smarts. “In most of the operations of the mind,” he wrote, “each American appeals only to the individual effort of his own understanding.”
Fifty years later, the law professor Ilya Somin pointedly described how little had changed. Like Hofstadter before him, Somin wrote in 2015 that the “size and complexity of government” have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities.”
Indeed, ignorance has become hip, with some Americans now wearing their rejection of expert advice as a badge of cultural sophistication.
The CDC issued a report in 2012 noting that raw dairy products were 150 times more likely than pasteurized products to cause food-borne illness. A Food and Drug Administration expert put it as bluntly as possible, calling the consumption of raw dairy the food equivalent of Russian roulette. None of this has swayed people who not only continue to ingest untreated products but insist on giving them to the consumers who have no choice or ability to understand the debate: their children.
The fact of the matter is that experts are more often right than wrong, especially on essential matters of fact. And yet the public constantly searches for the loopholes in expert knowledge that will allow them to disregard all expert advice they don’t like.
Doctors routinely tussle with patients over drugs. Lawyers will describe clients losing money, and sometimes their freedom, because of unheeded advice. Teachers will relate stories of parents insisting that their children’s exam answers are right even when they’re demonstrably wrong. Realtors tell of clients who bought houses against their experienced advice and ended up trapped in a money pit.
Americans no longer distinguish the phrase “you’re wrong” from the phrase “you’re stupid.” To disagree is to disrespect. To correct another is to insult. And to refuse to acknowledge all views as worthy of consideration, no matter how fantastic or inane they are, is to be closed-minded.
The sad reality, however, is that the average American has no real idea how his or her money is spent. Polls upon polls show not only that Americans generally feel government spends too much, and taxes them too highly, but also that they are consistently wrong about who pays taxes, how much they pay, and where the money goes.
Or consider foreign aid. This is a hot-button issue among some Americans, who deride foreign aid as wasted money. Americans routinely believe, on average, that more than 25 percent of the national budget is given away as largesse in the form of foreign aid. In reality, that guess is not only wrong, but wildly wrong: foreign aid is a small fraction of the budget, less than three-quarters of 1 percent of the total expenditures of the United States of America.
Second, and related to this point about relative skill, experts will make mistakes, but they are far less likely to do so than laypeople.
Knowing things is not the same as understanding them. Comprehension is not the same thing as analysis. Expertise is not a parlor game played with factoids.
Some years ago, for example, I had a call from a gentleman who insisted that he had some important work that might be of use in our curriculum at the Naval War College. He’d been referred to me by a former student at another school, and he very much wanted me to read an important article about the Middle East. I asked who wrote the piece. Well, he answered, he did. He was a businessman, and he’d “read a lot.” I asked if he’d had any training in the subject, visited the region, or read any of the languages of the Middle East. He had no such background, he admitted, and then said, “But after …”
The Dunning-Kruger Effect, in sum, means that the dumber you are, the more confident you are that you’re not actually dumb. Dunning and Kruger more gently label such people as “unskilled” or “incompetent.”
As it turns out, however, the more specific reason that unskilled or incompetent people overestimate their abilities far more than others do is that they lack a key skill called “metacognition.”
In our work, we ask survey respondents if they are familiar with certain technical concepts from physics, biology, politics, and geography. A fair number claim familiarity with genuine terms like centripetal force and photon. But interestingly, they also claim some familiarity with concepts that are entirely made up, such as the plates of parallax, ultra-lipid, and cholarine. In one study, roughly 90 percent claimed some knowledge of at least one of the nine fictitious concepts we asked them about. Even worse, “the more well-versed respondents considered themselves in a general topic, the more familiarity they claimed with the meaningless terms associated with it.”
To some extent, this is a problem not of general intelligence but of education. People simply do not understand numbers, risk, or probability, and few things can make discussion between experts and laypeople more frustrating than this “innumeracy,” as the mathematician John Allen Paulos memorably called it.
Conspiracy theories, by contrast, are frustrating precisely because they are so intricate. Each rejoinder or contradiction only produces a more complicated theory. Conspiracy theorists manipulate all tangible evidence to fit their explanation, but worse, they will also point to the absence of evidence as even stronger confirmation. After all, what better sign of a really effective conspiracy is there than a complete lack of any trace that the conspiracy exists? Facts, the absence of facts, contradictory facts: everything is proof. Nothing can ever challenge the underlying belief.
More important and more relevant to the death of expertise, however, is that conspiracy theories are deeply attractive to people who have a hard time making sense of a complicated world and who have no patience for less dramatic explanations.
Just as individuals facing grief and confusion look for reasons where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience.
In 2014, for example, an international study reached a surprising conclusion: people will go to great lengths to give each other a fair hearing and to weigh all opinions equally, even when everyone involved in the conversation knows there are substantial differences in competence between them. The authors of the study (which included people from China, Iran, and Denmark) suggest that this is an “equality bias” built into us based on a human need to be accepted as part of a group.
The social psychologist Jonathan Haidt summed it up neatly when he observed that when facts conflict with our values, “almost everyone finds a way to stick with their values and reject the evidence.”
A 2014 Pew Research Center study found that liberals are more likely than conservatives to block or unfriend people with whom they disagreed, but mostly because conservatives already tended to have fewer people with whom they disagreed in their online social circles in the first place. (Or as a Washington Post review of the study put it, conservatives have “lower levels of ideological diversity in their online ecosystem.”)
Fox’s history intersects with the death of expertise, however, in an important way: the arrival of Fox was, in its way, the ultimate expression of the partisan division in how people seek out sources of news in a new electronic marketplace.
The editor of the conservative journal First Things, R. R. Reno, wrote in 2016 that Roger Ailes was “perhaps the single most influential person behind the transformation of politics into entertainment over the last generation,” but that he’s since had plenty of help.
There is a generational difference at work here, as younger viewers are more likely than their elders to turn to a nontraditional source of information. But this morphing of news into entertainment stretches across every demographic. The whole business of staying informed has become a kind of postmodern exercise in irony and cynicism, with words like “truth” and “information” meaning whatever people want them to mean.
The story was in part based on studies that claim that one in four (sometimes reported as one in five) women in America’s colleges and universities will be sexually assaulted.
Similar to the “one in four” statistic is the now-common claim, repeated regularly in the American media, that US military veterans are killing themselves at an alarming rate because of the stress of fighting two major wars. “Twenty-two a day”—meaning twenty-two veteran suicides every twenty-four hours—has become the mantra of veterans’ service organizations and antiwar groups alike. Multiple stories have appeared in electronic and print media about the “epidemic” of veteran suicide in 2013 and after, with dramatic headlines and pictures of young men and women in uniform who’d ended their own lives.
It is true, in fact, that veterans are killing themselves at higher rates in the twenty-first century than in earlier years. But in part, that’s because everybody has been killing themselves at higher rates—for reasons epidemiologists are still debating—and veterans are part of “everybody.”
A beleaguered Veterans Administration—not exactly the most popular bureaucracy in America—tried in vain to note that according to a sizable 2012 study, suicides among veterans really hadn’t changed all that much since 1999.
“There is a perception that we have a veterans’ suicide epidemic on our hands. I don’t think that is true,” said Robert Bossarte, the epidemiologist who conducted the study.[26] Most of the stories didn’t bother with this quote, nor did they include important benchmarks like the overall suicide rate in America or the suicide rate among men in the same age cohort as the young combat veterans.
Be ecumenical. Vary your diet. You wouldn’t eat the same thing all day, so don’t consume the same sources of media all day. When I worked in national politics, I subscribed to a half-dozen journals at any given time, across the political spectrum. Don’t be provincial: try media from other countries; they often report stories, or offer perspectives, of which Americans are completely unaware. And don’t say you “don’t have the time.” You do.
Be less cynical—or don’t be so cynical. It’s extremely rare that anyone is setting out intentionally to lie to you.
People expect too much from expert prediction, but at least some experts are also willing to stand on their clairvoyance strongly enough to sell it. For decades, the political science professor Bruce Bueno de Mesquita has been using “proprietary software” to make predictions about world events for both public and private customers. His firm’s clients over some thirty years have included the US Central Intelligence Agency, which in a 1993 study said that in hundreds of predictions he “hit the bullseye” twice as often as its own analysts did.
Hedgehogs, for example, tended to be overly focused on generalizing their specific knowledge to situations that were outside of their competence, while foxes were better able to integrate more information and to change their minds when presented with new or better data. “The foxes’ self-critical, point-counterpoint style of thinking,” Tetlock found, “prevented them from building up the sorts of excessive enthusiasm for their predictions that hedgehogs, especially well-informed ones, displayed for theirs.”

