Kindle Notes & Highlights
by Tom Nichols
Read between
September 22 - September 23, 2024
No, the bigger problem is that we’re proud of not knowing things. Americans have reached a point where ignorance, especially of anything related to public policy, is an actual virtue. To reject the advice of experts is to assert autonomy, a way for Americans to insulate their increasingly fragile egos from ever being told they’re wrong about anything. It is a new Declaration of Independence: no longer do we hold these truths to be self-evident, we hold all truths to be self-evident, even the ones that aren’t true. All things are knowable and every opinion on any subject is as good as any…
The foundational knowledge of the average American is now so low that it has crashed through the floor of “uninformed,” passed “misinformed” on the way down, and is now plummeting to “aggressively wrong.” People don’t just believe dumb things; they actively resist further learning rather than let go of those beliefs.
But there is a self-righteousness and fury to this new rejection of expertise that suggest, at least to me, that this isn’t just mistrust or questioning or the pursuit of alternatives: it is narcissism, coupled to a disdain for expertise as some sort of exercise in self-actualization.
We are supposed to “agree to disagree,” a phrase now used indiscriminately as little more than a conversational fire extinguisher.
There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that “my ignorance is just as good as your knowledge.” Isaac Asimov
Americans now believe that having equal rights in a political system also means that each person’s opinion about anything must be accepted as equal to anyone else’s.
Not only is the Internet making many of us dumber, it’s making us meaner: alone behind their keyboards, people argue rather than discuss, and insult rather than listen.
Despite what many people think, experts and policymakers are not the same people, and to confuse the two, as Americans often do, corrodes trust among experts, citizens, and political leaders, all of whom must cooperate to make democracy work.
Besides, a certain amount of conflict between people who know some things and people who know other things is inevitable. There were probably arguments between the first hunters and gatherers over what to have for dinner.
In the original American populistic dream, the omnicompetence of the common man was fundamental and indispensable. It was believed that he could, without much special preparation, pursue the professions and run the government. Today, he knows that he cannot even make his breakfast without using devices, more or less mysterious to him, which expertise has put at his disposal; and when he sits down to breakfast and looks at his morning newspaper, he reads about a whole range of issues and acknowledges, if he is candid with himself, that he has not acquired competence to judge most of them.
At the root of all this is an inability among laypeople to understand that experts being wrong on occasion about certain issues is not the same thing as experts being wrong consistently on everything. The fact of the matter is that experts are more often right than wrong, especially on essential matters of fact. And yet the public constantly searches for loopholes in expert knowledge that will allow them to disregard all expert advice they don’t like.
Political debate and the making of public policy are not science. They are rooted in conflict, sometimes conducted as respectful disagreement but more often as a hockey game with no referees and a standing invitation for spectators to rush onto the ice.
Specialized knowledge is inherent in every occupation, and so here I will use the words “professionals,” “intellectuals,” and “experts” interchangeably, in the broader sense of people who have mastered particular skills or bodies of knowledge and who practice those skills or use that knowledge as their main occupation in life.
So how do we distinguish these experts among us, and how do we identify them? True expertise, the kind of knowledge on which others rely, is an intangible but recognizable combination of education, talent, experience, and peer affirmation. Each of these is a mark of expertise, but most people would rightly judge how all of them are combined in a given subject or professional field when deciding whose advice to trust.
Still, credentials are a start. They carry the imprimatur of the institutions that bestow them, and they are a signal of quality, just as consumer brands tend to promote (and, hopefully, protect) the quality of their products.
Talent separates those who have gained a credential from people who have a deeper feel or understanding of their area of expertise.
Indeed, asking about “experience” is another way of asking the age-old question: “What have you done lately?” Experts stay engaged in their field, continually improve their skills, learn from their mistakes, and have visible track records.
Another mark of true experts is their acceptance of evaluation and correction by other experts.
This self-policing is central to the concept of professionalism and is another way we can identify experts.
Expert communities rely on these peer-run institutions to maintain standards and to enhance social trust. Mechanisms such as peer review, board certification, professional associations, and other organizations and professions help to protect quality and to assure society—that is, the expert’s clients—that they’re safe in accepting expert claims of competence.
When you take an elevator to the top floor of a tall building, the certificate in the elevator does not say “good luck up there”; it says that a civic authority, relying on engineers educated and examined by other engineers, has looked at that box and knows, with as much certainty as anyone can, that you’ll be safe.
First, while our clumsy dentist might not be the best tooth puller in town, he or she is better at it than you.
Second, and related to this point about relative skill, experts will make mistakes, but they are far less likely to make mistakes than a layperson. This is a crucial distinction between experts and everyone else, in that experts know better than anyone the pitfalls of their own profession.
Knowing things is not the same as understanding them. Comprehension is not the same thing as analysis. Expertise is not a parlor game played with factoids. And while there are self-trained experts, they are rare exceptions.
No one’s knowledge is complete, and experts realize this better than anyone. But education, training, practice, experience, and affirmation by others in the same field should provide us with at least a rough guide to dividing experts from the rest of society.
Yeah, well, you know, that’s just, like, your opinion, man. “The Dude,” The Big Lebowski
Public debate over almost everything devolves into trench warfare, in which the most important goal is to establish that the other person is wrong.
We all have an inherent and natural tendency to search for evidence that already meshes with our beliefs.
This phenomenon is called “the Dunning-Kruger Effect,” named for David Dunning and Justin Kruger, the research psychologists at Cornell University who identified it in a landmark 1999 study. The Dunning-Kruger Effect, in sum, means that the dumber you are, the more confident you are that you’re not dumb. Dunning and Kruger more gently label such people as “unskilled” or “incompetent.” But that doesn’t change their central finding: “Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it.”
In other words, the least-competent people were the least likely to know they were wrong or to know that others were right, the most likely to try to fake it, and the least able to learn anything.
We are gripped by irrational fear rather than irrational optimism because confirmation bias is, in a way, a kind of survival mechanism. Good things come and go, but dying is forever.
When we fight confirmation bias, we’re trying to correct for a basic function—a feature, not a bug—of the human mind. Whether the question is mortal peril or one of life’s daily dilemmas, confirmation bias comes into play because people must rely on what they already know. They cannot approach every problem as though their minds are clean slates. This is not the way memory works, and more to the point, it would hardly be an effective strategy to begin every morning trying to figure everything out from scratch.
Thus, confirmation bias makes attempts at reasoned argument exhausting because it produces arguments and theories that are nonfalsifiable.
“Old wives’ tales” and other superstitions are classic examples of confirmation bias and nonfalsifiable arguments. Many superstitions have some kind of grounding in experience. While it’s a superstition that you shouldn’t walk under ladders, for example, it’s also true that it’s dangerous to walk under a ladder. Whether you’ll have bad luck all day by annoying a house painter is a different matter, but it’s just foolish to walk under his ladder.
One reason we all love a good conspiracy thriller is that it appeals to our sense of heroism. A brave individual against a great conspiracy, fighting forces that would defeat the ordinary person, is a trope as old as the many legends of heroes themselves. American culture loves the idea of the talented amateur (as opposed, say, to the experts and elites) who can take on entire governments—or even bigger organizations—and win.
More important and more relevant to the death of expertise, however, is that conspiracy theories are deeply attractive to people who have a hard time making sense of a complicated world and who have no patience for less dramatic explanations.
Conspiracy theories are also a way for people to give context and meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty of either an uncaring universe or an incomprehensible deity. These are awful choices, and even thinking about them can induce the kind of existential despair that leads a character in the nineteenth-century classic The Brothers Karamazov to make a famous declaration about tragedy: “If the sufferings of children go to make up the…
Whatever it is, somebody is at fault, because otherwise we’re left blaming only God, pure chance, or ourselves.
But even people who do not believe in hallucinatory theories often harbor a daily unwillingness to accept expert advice, and this resistance is rooted in the same kind of populist suspicion of those perceived as smarter or more educated than the general public. The damage might be less dramatic, but it is no less tangible, and sometimes quite costly.
People resist generalizations—boys tend to be like this, girls tend to be like that—because we all want to believe we’re unique and that we cannot be pigeonholed that easily.
Stereotypes are not predictions, they’re conclusions.
That’s why one of the most important characteristics of an expert is the ability to remain dispassionate, even on the most controversial issues.
…we don’t like to tell people we know or care about that they’re wrong. (At least not to their face.) Likewise, as much as we enjoy the natural feeling of being right about something, we’re sometimes reluctant to defend our actual expertise.
…other a fair hearing and to weigh all opinions equally, even when everyone involved in the conversation knows there are substantial differences in competence between them. The authors of the study (which included people from China, Iran, and Denmark) suggest that this is an “equality bias” built into us, based on a human need to be accepted as part of a group.
People skim headlines or articles and share them on social media, but they do not read them. Nonetheless, because people want to be perceived by others as intelligent and well informed, they fake it as best they can.
Higher education is supposed to cure us of the false belief that everyone is as smart as everyone else. Unfortunately, in the twenty-first century the effect of widespread college attendance is just the opposite: the great number of people who have been in or near a college think of themselves as the educated peers of even the most accomplished scholars and experts.
Still, the fact of the matter is that many of those American higher educational institutions are failing to provide to their students the basic knowledge and skills that form expertise. More important, they are failing to provide the ability to recognize expertise and to engage productively with experts and other professionals in daily life. The most important of these intellectual capabilities, and the one most under attack in American universities, is critical thinking: the ability to examine new information and competing ideas dispassionately, logically, and without emotional or personal…
At its best, college should aim to produce graduates with a reasonable background in a subject, a willingness to continue learning for the rest of their lives, and an ability to assume roles as capable citizens. Instead, for many people college has become, in the words of a graduate of a well-known party school in California, “those magical seven years between high school and your first warehouse job.” College is no longer a passage to educated maturity and instead is only a delaying tactic against the onset of adulthood—in some cases, for the faculty as well as for the students. Part of the…
Not only are there too many students, there are too many professors. The very best national universities, the traditional sources of university faculty, are promiscuously pumping out PhDs at a rate far higher than any academic job market can possibly absorb.
Schools and colleges have created a destructive spiral of credential inflation the same way governments cause monetary inflation: by printing more paper.

