Kindle Notes & Highlights
Read between March 5 and April 8, 2021
Nirvana Fallacy: A familiar phrase that captures the essence of the Nirvana fallacy is that “the perfect is the enemy of the good.” Essentially this style of argument starts with the premise that something is not perfect and concludes that it is therefore worthless. Vaccines don’t work 100 percent, therefore they are not useful. Science is flawed, therefore we don’t really know anything.
No True Scotsman: The name of this fallacy comes from the traditional illustrative example: The argument is made that all Scotsmen are brave, and when a counterexample of a cowardly Scotsman is provided, the original claim is defended by saying, “Well then, he’s no true Scotsman.”
Reductio ad Absurdum: In formal logic, the reductio ad absurdum (literally, “reduction to the absurd”) is a legitimate argument. It follows the form that if the premises are assumed to be true, logic necessarily leads to an absurd (false) conclusion, and therefore one or more premises must be false.
Slippery Slope: This logical fallacy is the argument that a position is not consistent or tenable because accepting the position means that the extreme of the position must also be accepted. But moderate positions do not necessarily lead down the slippery slope to the extreme.
A tautology is an argument that utilizes circular reasoning, which means that the conclusion is also its own premise. The structure of such arguments is A = B therefore A = B, although the premise and conclusion might be formulated differently so the tautology is not immediately apparent.
Perhaps the most common example is to argue that we know the Bible is the literal word of God because the Bible says so.
Texas Sharpshooter Fallacy: The name of this fallacy comes from its most common illustrative example. A target shooter claims that he can always hit the bull’s-eye. He aims at the side of a barn, shoots a hole in the wood, then goes up and draws a target around the hole he just made, giving himself a bull’s-eye. What this analogy refers to is when someone chooses the criteria for success or failure after they know the outcome. It becomes a form of post hoc reasoning, deciding that a certain piece of evidence is evidence for the conclusion you desire, but you decide that only after you know what the outcome is.
The moving goalpost is a method of denial that involves arbitrarily moving the criteria for “proof” or acceptance out of range of whatever evidence currently exists. If new evidence comes to light meeting the prior criteria, the goalpost is pushed back further—keeping it out of range of this new evidence.
When an argument isn’t sound, it simply means the argument does not support the conclusion. The conclusion may be true or false. The fallacy fallacy comes from concluding that a claim must be wrong because one argument proposed for it is not valid.
Heuristics are mental shortcuts. They are rules of thumb that allow you to approximate a likely answer quickly, but they’re not strictly true and often result in error. Heuristics may be helpful and may lead to the correct conclusion, but they are oversimplified. A flawed heuristic may result in a cognitive bias, and a cognitive bias may adversely affect a heuristic.
When offered two equivalent choices (meaning there was no reason why the subject should prefer one choice over the other), right-handed subjects in a study were more likely to choose the option on the right, while left-handers were more likely to choose the option on the left. (If a right-handed person had an injured right hand, their choice shifted to the left option.)
This may reflect a deep aspect of our thinking called embodied cognition. We tend to understand the world in terms of physical relationships, which we then extrapolate to more abstract thinking. For example, we say that your boss is “above” you hierarchically. They’re not physically above you, but we use this spatial analogy.
Related to our risk assessment biases is the gambler’s fallacy. If you flip a fair coin heads five times in a row, most people will feel tails is now due. The bias is the thinking that past events influence future events, even when there is no causal connection. Since each coin flip is independent, it does not matter what happened before. The chance of another heads is still 50 percent.
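This is easy to check directly. The following is just an illustrative sketch (mine, not the book's): simulate a long run of fair coin flips and look at the flip that immediately follows every streak of five heads. It still comes up heads about half the time.

```python
import random

random.seed(0)
flips = [random.choice("HT") for _ in range(1_000_000)]

# Record the flip that immediately follows each streak of five (or more) heads.
after_streak = []
streak = 0
for i in range(len(flips) - 1):
    streak = streak + 1 if flips[i] == "H" else 0
    if streak >= 5:
        after_streak.append(flips[i + 1])

rate = after_streak.count("H") / len(after_streak)
print(f"P(heads | five heads just occurred) ~ {rate:.3f}")  # ~ 0.500
```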
We are also biased by our basic psychology, which caters to our emotional needs. We need to feel we are part of a group where we are liked and accepted. Thus there is an in-group bias—we are biased toward our group, making much more favorable judgments of our in-group than an out-group. Similarly, we are hugely biased toward ourselves.
If we purchase an item, our assessment of the quality of that item goes up. We’re trying to justify the purchase to ourselves.
Once we know the outcome of a situation, that knowledge colors our interpretation of what happened and why. We tend to think that whatever happened was inevitable, destined to happen, even if it was a close call.
If you can think of an instance when someone who worked at the DMV was rude, you will likely think that such behavior is typical. If you know someone who was mugged in New York City, you think muggings are common in the Big Apple. Of course, neither of these conclusions is necessarily true—your own anecdotes are just that; they are quirky and not representative.
Let’s say a patient has palpitations, headaches, sweating, and high blood pressure. These are all symptoms that are very typical of a condition known as pheochromocytoma (a tumor that secretes adrenaline). However, “pheos” are extremely rare. Therefore, even with typical symptoms, the patient is far more likely to have anxiety, or hyperthyroidism, or pretty much anything else that can cause these symptoms. Pheos are that rare.
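Bayes' rule makes the same point with numbers. Everything below is a hypothetical sketch: the prevalences and symptom probabilities are assumptions chosen for illustration, not figures from the book, but they show why a common condition with less typical symptoms still beats a very rare one with textbook symptoms.

```python
# Assumed, illustrative numbers only.
prior_pheo = 1 / 100_000        # assumed prevalence of pheochromocytoma
prior_anxiety = 1 / 20          # assumed prevalence of anxiety

p_sym_given_pheo = 0.90         # assumed: symptoms very typical of pheo
p_sym_given_anxiety = 0.20      # assumed: symptoms less typical of anxiety

# Unnormalized posteriors (the numerators of Bayes' rule).
pheo = prior_pheo * p_sym_given_pheo
anxiety = prior_anxiety * p_sym_given_anxiety

print(f"Odds, anxiety : pheo ~ {anxiety / pheo:.0f} : 1")  # roughly 1,000 : 1
```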
Confirmation bias is a tendency to notice, accept, and remember information that appears to support an existing belief and to ignore, distort, explain away, or forget information that seems to contradict an existing belief. This process works undetected in the background of our minds to create the powerful illusion that the facts support our beliefs.
Relying on your subjective perception isn’t just unreliable; it’s also biased by your existing beliefs. To find out if the flu vaccine really is worthwhile, you need to look at objective numbers. Each year, only about 0.0001 percent of people who get the vaccine have a severe allergic reaction (that’s one person per million doses). And there’s no way of knowing how many people contract the flu before the vaccine has reached its peak effectiveness. Influenza can incubate for up to four days, and the flu shot isn’t fully effective until two full weeks after administration.
“remembering the hits and forgetting the misses.” A psychic will throw out all sorts of statements, and the client is likely to remember only the ones that stand out for being accurate.
We may have a narrative that blue-eyed people are rude. Whenever someone is rude to you, you may look to see what color eyes they have. If they have blue eyes, this confirms your narrative and you remember it as evidence to support it. If they do not have blue eyes, you dismiss that information as an exception and promptly forget the encounter. This is often how bigotry is formed and maintained.
One of the pitfalls of learning about critical thinking and pseudoscience is that your confidence can easily turn into cockiness. It’s important to remember that knowing about cognitive biases doesn’t make you immune to them. After reading this book you won’t be magically protected from error or bias. You won’t be able to assert your opinions as if they are the authoritative Truth.
In evolutionary terms, every organism and every species is out for itself. Plants, for example, have evolved all sorts of chemicals that are designed to be poisonous to animals as a deterrent to eating them. In this sense, nature is actively trying to kill us.
The single most toxic substance known (gram for gram), botulinum toxin, is all natural. Some natural poisons, in purified and carefully measured doses, can be exploited for their physiological effects. We call such poisons “drugs.” But make no mistake: The plants in which we find these substances use them for chemical warfare against anything that would eat them.
Everyone knows that nature is not benign on some level. I don’t know any adult who would walk into their backyard or the forest and eat a random plant they couldn’t identify. Chances are very good you will get very sick, and you might even die.
what is the difference between a molecule of vitamin C that was purified from rose hips and a molecule of vitamin C that was manufactured in a lab? By definition, nothing. Atoms and molecules don’t know where they came from. Their chemical and biological properties are not dependent on their source.
From a food science perspective, it is difficult to define a food product that is “natural” because the food has probably been processed and is no longer the product of the earth. That said, the FDA has not developed a definition for use of the term natural or its derivatives. However, the agency has not objected to the use of the term if the food does not contain added color, artificial flavors, or synthetic substances.
The organic food industry, for example, doesn’t allow the use of synthetic pesticides but does allow the use of so-called “natural” pesticides. In many cases the natural pesticides are less effective and more damaging to the environment than their synthetic alternatives. Some natural pesticides, like copper sulfate and rotenone, can be highly toxic. And yet they’re not viewed with the same caution, simply because they are “natural.”
We should be careful when interpreting the behavior of others. What might appear to be laziness, dishonesty, or stupidity might be better explained by situational factors of which we are ignorant. —Robert Todd Carroll
We are all very charitable to ourselves. Imagine if we were as habitually charitable to others. With an open mind and before reaching any conclusions, we can ask other people why they did or said what they did. It’s also okay to simply withhold judgment, to recognize that life is complex and we likely don’t have enough information to judge a situation.
While it does take vigilance, this bias is surprisingly simple to correct. First, recognize that you never have all the information. Next, withhold judgment and give other people the benefit of the doubt. Imagine that the other person is a character in their own movie. Find out what the plot of that movie is before you make yourself into a cliché.
It also seems extremely coincidental that such odd behavior would occur right where JFK was shot. It seems superficially reasonable to conclude that the two events are connected, that perhaps the Umbrella Man was part of a conspiracy. The real explanation is very interesting, but you could never guess it without specific knowledge. Conspiracy theorists, however, jumped on the Umbrella Man as an anomaly that could only be explained by invoking a conspiracy.
Pseudoscientists—those pretending to do science (or maybe even sincerely believing they are doing science) but who get the process profoundly wrong—use anomalies in a different way. They often engage in anomaly hunting, which is actively looking for apparent anomalies. They’re not, however, looking for clues to a deeper understanding of reality. They’re hunting in service to the pseudoscientific process of reverse engineering scientific conclusions.
The simple fact is that people do strange things for strange reasons. There’s no way to account for all possible thought processes of every person involved in an event. Often the actions of others seem unfathomable to us. Our instinct is to try to explain the behavior of others as resulting from mostly internal forces, and we tend to underestimate the influence of external factors.
Remember the lottery fallacy—conspiracy theorists tend to ask, “What are the odds of a man standing with an open umbrella right next to the president when he was shot?” Rather, they should be asking, “What are the odds of anything unusual occurring in any way associated with the JFK assassination?”
Random information, furthermore, is likely to contain patterns by chance alone. As Carl Sagan eloquently pointed out, randomness is clumpy. We tend to notice clumps. They stick out as potential patterns.
Data mining refers to the process of actively looking at large sets of data for any patterns (correlations). But since random data are clumpy, we should expect to see accidental correlations even when there is no real underlying phenomenon causing the correlation.
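A small simulation (an illustrative sketch, not something from the book) shows how easy it is to "find" patterns this way: generate columns of pure random noise, compare every pair, and some pairs will look strikingly correlated anyway.

```python
import random
import statistics

random.seed(1)
n_rows, n_cols = 50, 40
data = [[random.gauss(0, 1) for _ in range(n_rows)] for _ in range(n_cols)]

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sx, sy = statistics.pstdev(x), statistics.pstdev(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) * sx * sy)

# Mine every pair of noise columns for "interesting" correlations.
found = []
for i in range(n_cols):
    for j in range(i + 1, n_cols):
        r = corr(data[i], data[j])
        if abs(r) > 0.3:
            found.append((i, j, round(r, 2)))

print(f"{len(found)} 'strong' correlations found in pure noise")
```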
What are the odds of two people sharing the same birthday in a room containing twenty-three people? Often people will guess one in thirty or more. Surprisingly, it’s only about one in two. If you have seventy-five people in a room, the probability of two of them sharing a birthday is 99.9 percent.
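The arithmetic behind this is easy to verify. The sketch below (mine, not the book's) computes the chance that at least two of n people share a birthday by first computing the chance that all n birthdays are different.

```python
def shared_birthday_probability(n, days=365):
    # P(at least one shared birthday) = 1 - P(all birthdays different)
    p_all_different = 1.0
    for k in range(n):
        p_all_different *= (days - k) / days
    return 1 - p_all_different

print(f"23 people: {shared_birthday_probability(23):.3f}")   # ~ 0.507
print(f"75 people: {shared_birthday_probability(75):.4f}")   # ~ 0.9997
```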
If I am flipping a fair coin, which of the following sequences is more likely: HHTHTTTHT or HHHHHHHHH? Most people would choose the former, but the real answer is that they are equally likely. Each flip is independent, with a 50 percent chance of an H or T on every one. Matching either sequence exactly is therefore equally likely, but our intuition says that the more random-looking sequence must be more probable.
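The arithmetic: any specific sequence of nine flips has probability (1/2)^9, about 1 in 512, no matter how "random" it looks. A one-line check (not from the book):

```python
p = 0.5 ** 9
print(f"P(HHTHTTTHT) = P(HHHHHHHHH) = {p:.6f}")  # 0.001953, i.e., 1 in 512
```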
You blew it, and you blew it big! Since you seem to have difficulty grasping the basic principle at work here, I’ll explain. After the host reveals a goat, you now have a one-in-two chance of being correct. Whether you change your selection or not, the odds are the same. There is enough mathematical illiteracy in this country, and we don’t need the world’s highest IQ propagating more. Shame! —Scott Smith, Ph.D., University of Florida (“Ask Marilyn,” Parade, September 9, 1990)
The correct answer, which vos Savant gave and which caused such a hubbub, was that the best strategy was to change your choice of door once a goat has been revealed.
“There are two closed doors. One has a prize and the other a goat. That means it’s a 50/50 chance. Switching can’t improve the odds?” This kind of thinking is called the “equal probability” assumption, and it’s a common intuition that’s often incorrect (see Ruma Falk, 1992).
The door you picked clearly has a 1/3 chance of hiding the prize. That means the other two must have a 2/3 chance. If you decide to switch your door to #3, you are in essence picking both #2 and #3 at the same time, since you have the new information that a goat was behind #2. This switches your chance of winning from 1/3 to 2/3.
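If the 1/3 versus 2/3 argument still feels wrong, it can be brute-forced. The short simulation below is my own sketch (not from the book): play the game many times with each strategy and count the wins. Switching wins about two-thirds of the time, just as vos Savant argued.

```python
import random

random.seed(2)

def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the contestant's pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}")  # ~ 0.333
print(f"switch: {monty_hall(switch=True):.3f}")   # ~ 0.667
```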