The Skeptics' Guide to the Universe: How to Know What's Really Real in a World Increasingly Full of Fake
Inconsistency: It’s not valid to apply criteria or rules to one belief, claim, argument, or position but not to others.
Naturalistic Fallacy: The naturalistic fallacy refers to the is/ought problem, confusing what is true with what ought to be true. This is not to be confused with the appeal-to-nature fallacy (discussed at length in chapter 14), which posits that being natural is in itself a virtue and anything natural is inherently superior to anything artificial.
Nirvana Fallacy: A familiar phrase that captures the essence of the Nirvana fallacy is that “the perfect is the enemy of the good.” Essentially this style of argument starts with the premise that something is not perfect and concludes that it is therefore worthless. Vaccines don’t work 100 percent, therefore they are not useful. Science is flawed, therefore we don’t really know anything.
No True Scotsman: The name of this fallacy comes from the traditional illustrative example. The argument is made that all Scotsmen are brave, and when a counterexample of a cowardly Scotsman is provided, the original claim is defended by saying, “Well then, he’s no true Scotsman.”
Reductio ad Absurdum: In formal logic, the reductio ad absurdum (literally, “reduction to the absurd”) is a legitimate argument. It follows this form: if assuming the premises are true necessarily leads to an absurd (false) conclusion, then one or more of the premises must be false.
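To see how legitimate this form is, here is a minimal sketch of the rule in Lean 4 (my illustration, not from the book): if a proposition leads to a contradiction, we may conclude its negation.

```lean
-- Reductio ad absurdum as a formal rule: assuming P yields a
-- contradiction (False), so we conclude not-P.
theorem reductio (P : Prop) (h : P → False) : ¬P := h

-- Example: no proposition can be both true and false.
example (P : Prop) : ¬(P ∧ ¬P) :=
  reductio (P ∧ ¬P) (fun ⟨hp, hnp⟩ => hnp hp)
```

The fallacious use, by contrast, is declaring a conclusion "absurd" merely because it feels counterintuitive, rather than deriving an actual contradiction.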
Slippery Slope: This logical fallacy argues that a position is not tenable because accepting it means accepting its extreme as well. But moderate positions do not necessarily lead down the slippery slope to the extreme.
Straw Man: A straw man argument is one in which you construct a weak version of someone else’s position so that it’s an easier target for you to knock down. This is the opposite of the principle of charity, in which the strongest version of an opposing position is assumed. Often the straw man is a position no one actually holds. Attacking it can make it seem as if you have vanquished any objections to your own position, or it can be used to make the opposing side seem silly or easily refuted.
Tautology: A tautology is an argument that utilizes circular reasoning, which means that the conclusion is also its own premise. The structure of such arguments is A = B, therefore A = B, although the premise and conclusion might be formulated differently so the tautology is not immediately apparent.
This fallacy is often called “begging the question,” meaning that the premise assumes the conclusion, or that an argument assumes its initial point. Perhaps the most common example is to argue that we know the Bible is the literal word of God because the Bible says so.
Texas Sharpshooter Fallacy: The name of this fallacy comes from its most common illustrative example. A target shooter claims that he can always hit the bull’s-eye. He aims at the side of a barn, shoots a hole in the wood, then goes up and draws a target around the hole he just made, giving himself a bull’s-eye. The analogy describes choosing the criteria for success or failure only after the outcome is known. It becomes a form of post hoc reasoning: deciding that a certain piece of evidence supports the conclusion you desire, but deciding so only after you know the outcome.
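A quick simulation shows the statistical version of drawing the target afterward (an illustrative sketch of my own, with invented numbers, not from the book): if you look at twenty random "outcomes" and report whichever one happens to look significant, an apparent hit is nearly guaranteed by chance alone.

```python
# Post-hoc criterion selection: among 20 outcomes that each "succeed"
# 5% of the time by pure luck, how often can we find at least one hit
# to draw our target around?
import random

random.seed(1)
trials = 10_000
post_hoc_hits = 0
for _ in range(trials):
    outcomes = [random.random() < 0.05 for _ in range(20)]
    post_hoc_hits += any(outcomes)  # draw the bull's-eye around any lucky hole

print(f"chance of at least one post-hoc hit: {post_hoc_hits / trials:.0%}")
# ~64%, versus the 5% you'd see if a single criterion were fixed in advance
```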
The Moving Goalpost: The moving goalpost is a method of denial that involves arbitrarily moving the criteria for “proof” or acceptance out of range of whatever evidence currently exists. If new evidence comes to light meeting the prior criteria, the goalpost is pushed back further—keeping it out of range of this new evidence. Sometimes, impossible criteria are set up at the start—moving the goalpost impossibly out of range for the purpose of denying an undesirable conclusion.
Fallacy Fallacy: I’ve said it before, but just because an argument isn’t sound doesn’t mean the conclusion must be false. I might argue that global warming is real because the sky is blue. That’s a non sequitur, making the argument unsound. But it may still happen to be true that global warming is a real phenomenon. When an argument isn’t sound, it simply means the argument does not support the conclusion. The conclusion may be true or false. The fallacy fallacy comes from concluding that a claim must be wrong because one argument proposed for it is not valid.
Cognitive biases are flaws in the way our brains process information. Heuristics are similar—they’re rules of thumb or mental shortcuts that are not reliably true and therefore also lead to biased thinking.
The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story the mind has managed to construct. —Daniel Kahneman
Cognitive flaws and biases can be mitigated by metacognition, which is, simply put, thinking about thinking. Skepticism is largely a systematic effort in metacognition, which means understanding how we think and avoiding common mental pitfalls.
Does it really matter if an item is priced at $19.99 instead of $20.00? The answer is yes, as ridiculous as that may seem when you examine it. One penny should not make a difference, but we have a leftmost digit bias—we’re mainly influenced by the leftmost digit in any number.
Then there’s the handedness bias, which is a great example of how we often make decisions below the level of conscious awareness. When offered two equivalent choices (meaning there was no reason why the subject should prefer one choice over the other), right-handed subjects in a study were more likely to choose the option on the right, while left-handers were more likely to choose the option on the left.
This may reflect a deep aspect of our thinking called embodied cognition. We tend to understand the world in terms of physical relationships, which we then extrapolate to more abstract thinking. For example, we say that your boss is “above” you hierarchically. They’re not physically above you, but we use this spatial analogy. An argument might be said to be “weak” or “strong,” or if it’s especially bad we might even call it “lame.” You are “blind” to reality and “deaf” to the concerns of others.
Another important and pervasive bias is the framing bias. Your thinking about something can be dramatically affected simply by how it is presented to you.
Far more people will opt for a procedure described as having a 90 percent survival rate than for the same procedure described as having a 10 percent death rate.
Part of the reason for the difference is how we think about risk. Humans tend to be risk-averse when it comes to positive outcomes, and risk-seeking when it comes to negative outcomes.
Related to our risk assessment biases is the gambler’s fallacy. If you flip a fair coin heads five times in a row, most people will feel tails is now due. The bias is the thinking that past events influence future events, even when there is no causal connection. Since each coin flip is independent, it does not matter what happened before.
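A minimal simulation makes the independence point concrete (my own sketch, not from the book): find runs of five heads, then check what the sixth flip does. The coin has no memory.

```python
# After five heads in a row, is tails "due"? Sample many five-head
# streaks and record the very next flip.
import random

random.seed(0)
outcomes_after_streak = []
while len(outcomes_after_streak) < 10_000:
    flips = [random.random() < 0.5 for _ in range(6)]
    if all(flips[:5]):                          # five heads in a row
        outcomes_after_streak.append(flips[5])  # record the sixth flip

p_heads = sum(outcomes_after_streak) / len(outcomes_after_streak)
print(f"P(heads after five straight heads) ~ {p_heads:.3f}")  # close to 0.500
```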
We are also biased by our basic psychology, which caters to our emotional needs. We need to feel we are part of a group where we are liked and accepted. Thus there is an in-group bias—we are biased toward our group, making much more favorable judgments of our in-group than an out-group. Similarly, we are hugely biased toward ourselves. We will give ourselves every benefit of the doubt and interpret our own actions in the most favorable light.
We’re also motivated to minimize cognitive dissonance, that uncomfortable feeling we discussed in chapter 9, which occurs when facts conflict with each other or with our desires. If we purchase an item, our assessment of the quality of that item goes up. We’re trying to justify the purchase to ourselves.
We also tend to assume that other people think like we do, a phenomenon called the projection bias. This is the way we think about and understand what other people are likely thinking. We tend to use our own mind as a template to make predictions about how other people will think and act. So if something bothers us, we assume it bothers other people as well. This is closely related to the consensus bias. We tend to assume that our own opinions are in the majority, that most other people share them.
Hindsight bias can be a squirrely one. Once we know the outcome of a situation, that knowledge colors our interpretation of what happened and why. We tend to think that whatever happened was inevitable, destined to happen, even if it was a close call.
Hindsight bias is similar to post hoc reasoning—once we know the outcome, we are really good at inventing reasons to “explain” it.
Heuristics are a form of cognitive bias, but they serve a purpose and can be semi-useful. I tend to think of them as “90 percent rules.” If someone calls me on the phone with a story that ends with them needing my credit card number, I assume it’s a scam.
The availability heuristic, for example, is the unstated assumption that if we can easily call an example of something to mind, it must therefore be common or important.
A common heuristic with a statistical theme is the representativeness heuristic. We tend to think that someone or something likely belongs to a category if they have features typical of that category. While there is some truth to this, we overapply the rule and ignore two other critical pieces of information. First is the base rate—how common is that category? If the category is rare, then the probability of someone belonging to that category may still be low, even if they are typical for it. The second is predictive value—a feature may be typical of a category but not specific to it.
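A worked example shows how base rates and predictive value combine (a hypothetical illustration of my own; all numbers are invented): suppose a description is "typical of a librarian," fitting 90 percent of librarians, but it also fits 10 percent of farmers, and farmers are assumed to be ten times more numerous.

```python
# Bayes' theorem over two categories: being typical of a category is
# not enough if the category itself is rare.
p_librarian = 0.002               # assumed base rate of librarians
p_farmer = 0.020                  # assumed base rate of farmers (10x more)
p_fit_given_librarian = 0.90      # description is typical of librarians...
p_fit_given_farmer = 0.10         # ...but not specific to them

lib = p_fit_given_librarian * p_librarian
farm = p_fit_given_farmer * p_farmer
print(f"P(librarian | fits description) = {lib / (lib + farm):.2f}")  # ~0.47
```

Even though the description is nine times more typical of librarians, under these assumed base rates the person is still slightly more likely to be a farmer.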
Then there’s the unit bias. It derives from a desire to simplify a complex world and make reasonable first approximations, which aren’t always accurate. We tend to focus on one salient feature of an item or thing and use that feature as the one measure of value, quality, or quantity. In a 2009 study, Andrew Geier and Paul Rozin asked people to estimate the weight of others and found that subjects relied solely on girth while ignoring height, which led to significant inaccuracies. The unit bias is also very common in the marketing of technology. Remember computer buying in the nineties and early 2000s, when a single headline spec was treated as the measure of the whole machine?
Finally, we come to the anchoring heuristic. This one is also common in marketing. Let’s say I show you a house and then ask you if you think the house is worth more or less than $100,000. I then ask you what you think the house is worth. I ask another person if they think the same house is worth more or less than $500,000, and then I ask them to guess the price. The person who was “anchored” to the $100,000 reference will guess significantly lower than the person who was anchored to the $500,000 price, even though they’re assessing the same house.
Confirmation bias is the tendency of individuals to seek out or interpret new information as support for previously held notions or beliefs, even when such interpretations don’t hold up to statistical scrutiny.
Ever since I first learned about confirmation bias, I’ve been seeing it everywhere. —Jon Ronson
Confirmation bias is a tendency to notice, accept, and remember information that appears to support an existing belief and to ignore, distort, explain away, or forget information that seems to contradict an existing belief. This process works undetected in the background of our minds to create the powerful illusion that the facts support our beliefs.