Kindle Notes & Highlights
Read between March 5 and April 8, 2021
There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another which states that this has already happened. —Douglas Adams
The claims of evolutionary biology and a literal interpretation of Genesis could not both be true at the same time. This meant questioning and doubting. I had to figure out which narratives to believe and which ones to reject. This is the essence of skepticism: How do we know what to believe and what to doubt?
Fully realizing this essence of the human condition can make you cynical, denying all knowledge. That, however, is just another bias, another narrative to help us deal with the chaos. Cynicism is also cheap—it’s easy just to doubt or deny everything. But it doesn’t get you or society anywhere. Skepticism goes beyond cynicism. While it may start with doubt, that’s the beginning, not the end. There isn’t any definitive or ultimate knowledge (no Truth with a capital T), but we can grind out knowledge about the world that is sufficiently reliable for us to treat it as provisionally true and act accordingly.
Sagan investigated what we thought in the past, and how new discoveries changed our thinking. In one episode he dedicated a segment to questioning the alleged evidence for alien visitation. That segment was a revelation. If there was ever a moment when I tipped over into becoming a skeptic, that was it. Here was a respected scientist carefully arguing with logic and evidence why the case for UFOs was not compelling. In fact it was utter crap. That meant that all those alien visitation documentaries (including half of the In Search of . . . episodes) were not just wrong, they were nonsense. …
The arguments of creationists ranged from silly to sophisticated, but ultimately they were all flawed. I tried my best to understand where they went wrong. At first I naively believed that if I could just explain to creationists (when I had the opportunity, during random in-person encounters) the flaws in their reasoning or the factual errors in their premises, I could change their minds. While this isn’t impossible, it proved far more difficult than I had imagined.
The truth may be puzzling. It may take some work to grapple with. It may be counterintuitive. It may contradict deeply held prejudices. It may not be consonant with what we desperately want to be true. But our preferences do not determine what’s true. —Carl Sagan
These tools—your core concepts of scientific skepticism—can be broken down into four categories.
1. Neuropsychological humility, knowledge of all the ways in which your brain function is limited and flawed.
2. Metacognition, thinking about thinking.
3. Science.
4. Historical journeys, reviewing iconic examples of pseudoscience and deception.
That’s why “scientific” skeptics are not philosophical skeptics, professing that no knowledge is possible. We are also not cynics, who doubt as a social posture or just have a generally negative attitude about humanity. We are not contrarians who reflexively oppose all mainstream opinions. The term “skeptic” has also been hijacked by deniers who want to be viewed as genuine skeptics (asking the hard and uncomfortable questions) but are really just pursuing an agenda of denial for ideological reasons.
When countering common but false beliefs, it’s not enough to understand the relevant science; you also have to know how science goes wrong, how people form and maintain false beliefs, and how they promote those beliefs. Such expertise is generally lacking within mainstream academia, and that’s where skeptics come in.
Science and reason can only flourish in a free society in which no ideology (religious or otherwise) is imposed upon individuals or the process of science and free inquiry.
A 2017 study by John Cook et al., confirming prior research, showed that exposing people to misinformation about the scientific consensus on global warming had a polarizing effect based on political ideology. People who already accepted the consensus did so even more, and those who rejected the consensus held more firmly to their rejection. Correcting that misinformation had almost no effect on reducing this polarization—facts were simply not enough to change people’s minds.
However, if you started by explaining to the subjects how fake experts can be used to falsely challenge the scientific consensus, the polarizing effect of this misinformation was completely neutralized. This is exactly why we choose to promote science partly by exposing pseudoscience, and not just the false information of bad science but the deceptive (sometimes self-deceptive) tactics that pseudoscientists use. It’s not enough just to teach people science; you have to teach them how science works and how to think in a valid way. This book is meant to be one giant inoculation against bad science. …
Most people have had the experience of being in a heated conversation, and then once the dust settles and everyone tries to resolve the discrepancies, it becomes clear that each person has a different memory of the conversation that just happened. You may be tempted in these situations to assume your memory is accurate and everyone else is losing their mind. Of course, that’s what everyone else is thinking about you.
As described under “Fallibility of Perception,” what we perceive minute by minute is also an active, constructive process. The various sensory streams are all filtered for the important stuff, assumptions are made about what we are likely perceiving, and everything is compared in real time to the rest and to our existing models and assumptions about the world. Preference is given to continuity, making sure everything syncs up.
In a 2010 study by Isabel Lindner et al., researchers found that simply observing another person performing an act can create false memories that we performed that act. They report: “In three experiments, participants observed actions, some of which they had not performed earlier, and took a source-memory test. Action observation robustly produced false memories of self-performance relative to control conditions.”
“I clearly remember . . .” Stop! No, you don’t. You have a constructed memory that is likely fused, contaminated, confabulated, personalized, and distorted. And each time you recall that memory you reconstruct it, changing it further.
The act of perception is a complex, highly filtered, and active constructive process by your brain. We do not passively perceive external stimuli like a camera. This constructive process introduces many possibilities for illusion and misperception.
The bottom line is this: Your real-time perceptions are not a passive recording of the outside world. Rather, they are an active construction of your brain. This means that there is an imperfect relationship between outside reality and the model of that reality crafted by your brain.
First, only a minute fraction of information from the outside world even makes it to the portions of your brain that construct your perception. Much of it is missed because your organs of perception aren’t perfect and there are numerous trade-offs.
When your movements match your intentions, your brain creates the sensation that you control your body. If this circuit is interrupted, however, the result is something called alien hand syndrome. With this syndrome, a body part seems to act on its own, without your control. The lesson here is that even the most basic components of your existence are actively constructed by your brain. Each component can be disrupted and erased.
A wonderful demonstration of change blindness was created by psychologist Richard Wiseman and can be found by searching “colour changing card trick” on YouTube. Watch it and be amazed. You can also just Google “change blindness” and be rewarded with all sorts of fun.
The technical term for the more general phenomenon of seeing patterns where they do not exist is apophenia, the tendency to see illusory patterns in noisy data. The information doesn’t even have to be sensory; the pattern can be in numbers or in events. (In this way conspiracy theories can result from apophenia—seeing a nefarious pattern in random or disconnected incidents.)
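To make that concrete: truly random data are full of streaks that feel meaningful. The Python sketch below is my own illustration, not from the book; the function longest_run, the streak threshold of six, and the sequence length of one hundred are all assumptions chosen for the demonstration.

```python
import random

def longest_run(flips):
    # Length of the longest streak of identical outcomes in the sequence.
    best = run = 0
    prev = None
    for f in flips:
        run = run + 1 if f == prev else 1
        prev = f
        best = max(best, run)
    return best

random.seed(1)
trials = 10_000
# Count how often 100 fair coin flips contain a streak of 6+ identical
# outcomes (all heads or all tails) purely by chance.
hits = sum(
    longest_run([random.random() < 0.5 for _ in range(100)]) >= 6
    for _ in range(trials)
)
print(f"100-flip sequences containing a streak of 6+: {hits / trials:.0%}")
# Typically around 80%: long streaks are the norm in random data,
# yet they feel like meaningful patterns when we stumble across them.
```

Run it and the “striking” streak turns out to be the expected outcome, not the exception, which is exactly why our pattern-hungry brains are so easily fooled by noise.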
In Newfoundland they called such nighttime visitors the “old hag.” In fact the term “nightmare” derives from such beliefs—“maere” is the Old English word for a female demon who suffocates people in their sleep.
If someone is able to show me that what I think or do is not right, I will happily change, for I seek the truth, by which no one was ever truly harmed. It is the person who continues in his self-deception and ignorance who is harmed. —Marcus Aurelius
In a 2014 article discussing his now famous paper, Dunning summarized the effect: “Incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are.”
I sometimes hear the effect incorrectly described as “the more incompetent you are, the more knowledgeable you think you are.” As you can see, self-estimates do decrease with decreasing knowledge, but the gap between performance and self-assessment increases as your performance decreases.
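The shape of that claim is easier to see with numbers. The values in the sketch below are hypothetical, invented only to mirror the pattern described; they are not data from the Kruger-Dunning study.

```python
# Hypothetical numbers mirroring the described pattern (not study data):
# (actual percentile, self-assessed percentile) by performance quartile.
quartiles = {
    "bottom": (12, 60),
    "second": (37, 62),
    "third": (63, 68),
    "top": (88, 75),
}

for name, (actual, self_est) in quartiles.items():
    print(f"{name:>6} quartile: actual {actual:>2}, "
          f"self-estimate {self_est}, gap {self_est - actual:+d}")
# Self-estimates do fall as performance falls (75 -> 60), but the gap
# between self-assessment and reality widens sharply at the bottom
# (-13 for the top quartile vs. +48 for the bottom quartile).
```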
There are several possible causes of the effect. One is simple ego—no one wants to think of themselves as below average, so they inflate their self-assessment. People also have an easier time recognizing ignorance in others than in themselves, and this will create the illusion that they are above average, even when they’re performing in the bottom 10 percent.
An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge. —David Dunning
What I think Dunning is describing above—a conclusion with which I completely agree—are the various components of confirmation bias (see chapter 12). As we try to make sense of the world, we work with our existing knowledge and paradigms. We formulate ideas and then systematically seek out information that confirms those ideas. We dismiss contrary information as exceptions. We interpret ambiguous experiences as in line with our theories. We make subjective judgments that further reinforce our beliefs. We remember these apparent confirmations, and then our memories are tweaked over time to make …
In the end we are left with a powerful sense of knowledge—but it’s false knowledge. Confirmation bias leads to a high level of confidence: We feel deep in our gut that we are right. And when confronted by someone saying we’re wrong or promoting an alternate view, there is a tendency to become defensive, even hostile.
Admit it: Up to this point in the chapter, you were probably imagining yourself in the upper half of that curve and inwardly smirking at the poor rubes in the bottom half. But we are all in the bottom half some of the time. The Dunning-Kruger effect does not just apply to other people—it applies to everyone.
That’s why the world is full of incompetent, deluded people—we all are these people. This pattern, however, is just the default mode. It is not our destiny. Part of skeptical philosophy, metacognition, and critical thinking is the recognition that we all suffer from powerful and subtle cognitive biases. We have to both recognize them and make a conscious effort to work against them, realizing that this is an endless process. Part of the way to do this is to systematically doubt ourselves. We need to substitute a logical and scientific process for the one Dunning describes above.
Being thrown into a profession where your knowledge is constantly being tested and evaluated, partly because knowledge is being directly translated into specific decisions, appears to have a humbling effect (which is good). It also helps that your mentors have years or decades more experience than you—this can produce a rather stark mirror.
Think about some area in which you have a great deal of knowledge, at the expert to mastery level (or maybe a special interest in which your knowledge is above average). Now, think about how much the average person knows about your area of specialty. Not only do they know comparatively little, they likely have no idea how little they know and how much specialized knowledge even exists. Furthermore, most of what they think they know is likely wrong or misleading.
Here comes the critical part: Now realize that you are as ignorant as the average person is in every other area of knowledge in which you are not expert. The Dunning-Kruger effect is not just about dumb people not realizing how dumb they are. It is about basic human psychology and cognitive biases. Dunning-Kruger applies to everyone.
Motivated reasoning is the biased process we use to defend a position, ideology, or belief that we hold with emotional investment. Some information, some ideas, feel like our allies. We want them to win. We want to defend them. And other information or ideas are the enemy, and we want to shoot them down. —Julia Galef
Have you ever been in a heated political discussion? Heck, have you ever interacted with other human beings? Then you are likely familiar with the frustration of someone else twisting logic, cherry-picking or distorting facts, and being generally biased in their defense of a position. Of course, here’s the thing: You do it too.
The more information we have about something, and therefore the more solid our belief, the more slowly we will change that belief. We don’t just change from one thing to the next; we incorporate the new information with our old information.
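This is essentially how Bayesian updating behaves, and a toy model makes the point concrete. The sketch below is my own illustration, not the author’s model: it assumes beliefs can be represented as pseudo-counts in a simple Beta-distribution estimate, so the same new evidence moves a weakly supported belief a long way but barely budges a heavily supported one.

```python
def updated_mean(prior_s, prior_f, new_s, new_f):
    # Posterior mean of a Beta belief after folding in new observations.
    a, b = prior_s + new_s, prior_f + new_f
    return a / (a + b)

# Both observers start at the same belief (~80% confident), but the
# second observer's belief rests on 100x more accumulated information.
weak_prior = (8, 2)        # 10 prior observations
strong_prior = (800, 200)  # 1,000 prior observations
evidence = (2, 8)          # new evidence leaning the other way

for label, (s, f) in [("weak prior", weak_prior),
                      ("strong prior", strong_prior)]:
    print(f"{label}: {s / (s + f):.2f} -> {updated_mean(s, f, *evidence):.2f}")
# weak prior: 0.80 -> 0.50    (belief moves a lot)
# strong prior: 0.80 -> 0.79  (same evidence barely moves it)
```

Under this toy model, slow belief change in the face of much prior information is rational; the trouble the book describes begins when the “prior information” is itself the product of confirmation bias.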
The constraint on reason, though, is far greater when dealing with the special set of beliefs in which we have an emotional investment. These are alleged facts or beliefs about the world that support our sense of identity or our ideology.
We all have narratives by which we understand the world and our place in it. Some narratives are critical to our sense of identity. Preferred narratives support our worldview, our membership in a group, or our self-perception as a good and valuable person. We have narratives and beliefs that serve our basic psychological needs, such as the need for a sense of control. When those beliefs are challenged, we don’t take a rational and detached approach. We dig in our heels and engage in what is called motivated reasoning. We defend the core beliefs at all costs, shredding logic, discarding …
Cognitive dissonance theory was first proposed by Leon Festinger in 1957. He suggested that psychological discomfort results when we are presented with two pieces of information that conflict with each other. We hold a belief, and now we have information that contradicts that belief. Ideally, we would resolve the conflict rationally and objectively, changing the belief as necessary, depending on the nature and validity of the new information.
When the belief is strongly and emotionally held, however, it becomes too difficult to change. If the belief is at the core of our worldview, then changing it might cause a chain reaction, magnifying cognitive dissonance.
It’s emotionally easier to simply dismiss the new information, challenge its source, rationalize its implications, even ...