Logically Fallacious: The Ultimate Collection of Over 300 Logical Fallacies (Academic Edition) (Dr. Bo's Critical Thinking Series)
Spin Doctoring (also known as: spinning) Description: Presenting information in a deceptive way so that others interpret it in a manner that does not reflect reality but rather reflects how you want the information to be interpreted. Logical Form: X represents reality. Information is presented in such a way that Y appears to represent reality.
Stolen Concept Fallacy Description: Requiring the truth of the very thing that you are simultaneously trying to disprove. Logical Form: Person 1 is attempting to disprove X. X is required to disprove X. Example #1: Reason and logic are not always reliable, so we should not count on them to help us find truth. Explanation: Here we are using reason to disprove the validity of reason, which is unreasonable—reasonably speaking. Example #2: Science cannot be trusted. It is a big conspiracy to cover up the truth of the Bible and the creation story. I saw fossils in the creation museum with humans and ...more
Strawman Fallacy Description: Substituting a person’s actual position or argument with a distorted, exaggerated, or misrepresented version of that position or argument. Logical Form: Person 1 makes claim Y. Person 2 restates person 1’s claim (in a distorted way). Person 2 attacks the distorted version of the claim. Therefore, claim Y is false.
Get in the habit of steelmanning the argument. The opposite of the strawman is referred to as the steelman, which is a productive technique in argumentation where the one evaluating the argument makes the strongest case for the argument, assuming the best intentions of the interlocutor. This technique prevents pointless, time-wasting bickering and demonstrates respect for both the interlocutor and the process of critical argumentation.
Style Over Substance (also known as: argument by slogan [form of], cliché thinking - or thought-terminating cliché, argument by rhyme [form of], argument by poetic language [form of]) Description: When the arguer embellishes the argument with compelling language or rhetoric, and/or visual aesthetics. This comes in many forms as described below. “If it sounds good or looks good, it must be right!” Logical Form: Person 1 makes claim Y. Claim Y sounds catchy. Therefore, claim Y is true. Example #1: A chain is only as strong as its weakest link. Explanation: Most applications of language, like the ...more
Survivorship Fallacy (also known as: survivorship bias) Description: This is best summed up as “dead men don’t tell tales.” In its general form, the survivorship fallacy is basing a conclusion on a limited number of “winner” testimonies because we cannot or do not hear the testimonies of the losers. It is based on the cognitive bias called the survivorship bias. Logical Form: There are X winners and Y losers. We only hear the testimonies of the winners. Therefore, our conclusion is based on X winners.
Example #2: The survivorship bias is used by scammers and con artists who take advantage of the “statistically ignorant” public. One common scam is something I call the “prophetic investor.” The scammer will send an e-mail to a very large group of people (say 10 million) with a claim that they have a perfect track record for picking winning investments. But they tell you not to take their word for it; let them prove it to you by picking, for seven days in a row, one stock per day that increases in value. Then, they say, when you are convinced, call them and invest with them. Here’s how the scam works: Day ...more
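The arithmetic behind this kind of scam is easy to check. In one common version of the scheme, the scammer sends opposite predictions to each half of the list and then keeps only the half that received the correct one, so every surviving recipient has seen nothing but correct picks. A minimal sketch, using the 10-million list size and seven-day window from the example above (the halving mechanics are an assumption about how such a scheme is typically run, not a quote from the book):

```python
# Minimal sketch of the "prophetic investor" scam: each day the
# scammer tells half the remaining list the stock will rise and the
# other half it will fall, then keeps only the half that received
# the correct prediction.
recipients = 10_000_000
for day in range(1, 8):
    recipients //= 2  # only half saw a correct prediction today
    print(f"Day {day}: {recipients:,} people have seen only correct picks")
```

After seven days, roughly 78,000 people have witnessed a "perfect" week of predictions purely by construction, with no forecasting skill involved.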
Tokenism Description: Interpreting a token gesture as an adequate substitute for the real thing. Logical Form: Problem X exists. Solution Y is offered. Solution Y is inadequate to solve problem X but accepted as adequate. Example #1: The presidential nominee has been accused of being racist. But he recently stated that he really liked the movie “Roots,” so I guess he isn’t racist. Explanation: Liking one movie that exposes racism and encourages equality is far from the same as not being a racist.
Unfalsifiability (also known as: untestability) Description: Confidently asserting that a theory or hypothesis is true or false even though the theory or hypothesis cannot possibly be contradicted by an observation or the outcome of any physical experiment, usually without strong evidence or good reasons.
Unreasonable Inclusion Fallacy Description: Attempting to broaden the criteria for inclusion in an ill-famed group, or for association with a negative label, to the point where the term's definition is changed substantially to condemn or criminalize a far less malicious or deleterious behavior. Logical Form: Person A is accused of bad behavior X. Group Y traditionally does not include individuals with bad behavior X. Person A is said to be a part of group Y for bad behavior X. Alternatively: Person A is accused of bad behavior X. Label Y traditionally does not include bad behavior X. Person A is given label Y for ...more
Example #2: I believe that when we die, we are all given new, young, perfect bodies, and we spend eternity with those whom we love. I can’t imagine the point of life if it all just ends when we die! Explanation: The fact that one doesn’t like the idea of simply not existing is not evidence for the belief. Besides, nobody seemed to mind the eternity during which they didn’t exist before they were born.
Exception That Proves the Rule: Exceptions to rules are evidence against the rules, never for them. This is a strange relic from old law that means little in argumentation today.
Sealioning: A subtle form of trolling involving “bad-faith” questions. You disingenuously frame your conversation as a sincere request to be enlightened, placing the burden of educating you entirely on the other party. This is not a fallacy; it is more of a form of deception. As always, be careful in assuming you know the other person’s intent. On the surface, “sealioning” looks a lot like legitimate and honest Socratic inquiry.
Weasel Wording: Using ambiguous words in order to mislead or conceal a truth: “Save up to 50% or more!” This is more of a marketing gimmick than a fallacy.
Availability Heuristic: A mental shortcut that relies on immediate examples that come to mind when evaluating a specific topic, concept, method or decision. Example: Believing that driving across the country is safer than taking a plane because you remember seeing a horrific plane crash on the news.
Cheerleader Effect: The bias that causes us to find individuals more attractive when they are in a group. Example: “Those guys are gorgeous!… Oh, wait. Now that they split up, I am looking at them individually and they really aren’t that good looking.”
Conservatism: The tendency to revise one’s belief insufficiently when presented with new evidence. Example: Gert believes in the Loch Ness Monster. Gert is shown how every photo and story used as “proof” has been discredited. Gert is now less convinced, but still has an unreasonably strong belief.
Context Effect: A concept within cognitive psychology that has to do with how environmental factors affect our perception of stimuli. Example: A joke might be hilarious at a comedy club, but the same joke at a funeral is not at all funny.
Continued Influence Effect: This refers to the way false information enters memory and continues to affect beliefs even after the false information has been corrected. Example: Accusations were made about politician X being a rapist. All allegations were clearly found to be baseless, yet the mere idea destroyed any chances for politician X to win.
Contrast Effect: Adding or subtracting value to subjects or objects based on how we analyze them as compared to what we perceive as a normal case. Example: People who are happy with their salary are later unhappy with their salary when they find out that their coworkers are getting paid more.
Curse of Knowledge: Assuming that others with whom you are communicating have the same background knowledge about the topic(s) as you do. Example: Many bad teachers assume that the students already know what the teacher knows, so they lose the students in the process of teaching.
Dunning–Kruger Effect: When people are too ignorant to realize the extent of their own ignorance. Example: Politician X thinks he can easily solve the problems we have in the Middle East. This is because he knows very little about the problems.
Duration Neglect: The psychological observation that people’s judgments of the unpleasantness of painful experiences depend very little on the duration of those experiences. Example: If person A has his hand in ice-cold water for 10 seconds, and person B has his hand in ice-cold water for 30 seconds, both people are likely to rate the unpleasantness of the experience the same.
Endowment Effect: The hypothesis that people ascribe more value to things merely because they own them. Example: John would never buy a trinket for $5, but if he were given the trinket, he probably wouldn’t sell it for $5 either.
Experimenter’s or Expectation Bias: This is a research-related bias. This is the tendency for researchers (experimenters) to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations. Example: A Christian researcher does a study on the effectiveness of prayer. She finds no effect, so she does not publish the study with the negative results.
Fading Affect Bias: The tendency to forget information associated with negative emotions more quickly than information associated with pleasant emotions. Example: One is more likely to remember the details of their upcoming vacation and less likely to remember the details of a business trip that one is not looking forward to taking.
False Consensus Effect: The tendency to overestimate the extent to which one’s own opinions, beliefs, preferences, values, and habits are normal and typical of those of others. Example: “Nobody likes going to the movies anymore. I have not been to the movies in years.”
Frequency Illusion: The tendency to notice instances of a particular phenomenon once one starts to look for it, and therefore to believe erroneously that the phenomenon occurs more often than it does. Example: You start to look for “signs” that you should take a new job, and you start to see them everywhere. Actually, you are just interpreting common events in a way that supports your conscious or unconscious desire to take or reject the job.
Hard–easy Effect: The tendency to be overconfident about the correctness of answers to difficult questions and underconfident about answers to easy questions. Example: This frequently happens with multiple choice questions where people consistently second guess their answers to easier questions and are more confident on the ones that they actually get wrong.
Identifiable Victim Effect: The tendency of individuals to offer greater aid when a specific, identifiable person (or “victim”) is observed under hardship, as compared to a large, vaguely defined group with the same need. Example: “A single death is a tragedy, a million deaths is a statistic.”
IKEA Effect: The tendency to place a disproportionately high value on products one partially creates. Example: “Do you like this table? It is my favorite piece of furniture.” “It looks like an ordinary table.” “Yea, but I assembled it!”
Illusion of Asymmetric Insight: The tendency to perceive one’s knowledge of others to surpass other people’s knowledge of them. Example: We think we know our spouse better than he or she knows us.
Impact Bias: The tendency for people to overestimate the length or the intensity of future feeling states. Example: Sandy thinks she would be miserable for months if she was dumped by Troy. Troy dumped Sandy. Sandy was only miserable for a couple of days and quickly got over it.
Just-world Hypothesis: The tendency to believe one will get what one deserves that often leads to a rationalization of an inexplicable injustice by suggesting things the victim might have done to deserve it. Example: The idea that homeless people are homeless because they are lazy, uneducated, substance abusers who brought their situation upon themselves.
Moral Luck: The tendency to ascribe moral praise or condemnation to a moral agent when they have no control over the factors that brought about the moral judgment. Example: Carl and Jason go out for a night of drinking at the local bar. They both drive home intoxicated in their separate cars. Carl gets pulled over by a cop and arrested for DUI, while Jason does not. Carl is seen as morally inferior to Jason.
Omission Bias: The tendency to judge harmful actions as worse, or less moral than equally harmful omissions (inactions). Example: A person who passes a kid drowning and does nothing might be seen as a cold-hearted ass-clown, but not a murderer.
Outgroup Homogeneity Bias: The tendency to perceive out-group members as more similar to one another than in-group members. Example: All people on the other team are essentially the same—mean and nasty, whereas all those on our team have all different personalities.
Peak-end Rule: An event makes its mark in our memories more by what happens at its peak and at its end than by what happens at any other point. Example: Judging a really bad movie as good just because it had an exciting ending, then watching the movie a second time and being terribly disappointed for the first 98% of the movie.
Planning Fallacy: The tendency for predictions about how much time will be needed to complete a future task to underestimate the time actually needed. Example: “I can be ready in 10 minutes.” In fact, it takes the person 20 minutes.
Reactance Bias: The tendency to do something different from what someone wants you to do in reaction to a perceived attempt to constrain your freedom of choice. Example: An employee is asked by his boss to file a report by noon. He doesn’t like being his boss’ puppet, so the employee files the report by 1:00 instead.
Reactive Devaluation: The tendency to devalue a proposal if it originates from an antagonist (i.e., some source that the person does not like). Example: A politician makes an excellent decision that will be of great benefit to the country, but because the politician is a Republican, many liberals think it is a bad decision.
Rosy Retrospection: The tendency to remember and recollect events more favorably than one judged them when they actually occurred. Example: “Back in the ‘80s, people were friendly and life was grand!”
Semmelweis Reflex: A metaphor for the reflex-like tendency to reject new evidence or new knowledge because it contradicts established norms, beliefs or paradigms. Example: Throughout the centuries, scientific facts have met much resistance until they could no longer be ignored, due to strong, conflicting religious beliefs. Even today, there are still those who refuse to accept that the earth is not flat.
Social Comparison Bias: The tendency to dislike and compete with someone who is seen as physically or mentally better than yourself. Example: Rod meets Carl for the first time. Carl is good-looking, well-built, and holds a PhD in physics. Rod can’t stand Carl but can’t pinpoint why.
Status Quo Bias: The tendency to prefer the current state of affairs. Example: The common observation that people generally resist change is largely explained by this bias. We pass up good opportunities because we prefer the status quo.
The Reductios: Techniques for Exposing Fallacious Reasoning There are two techniques that are well-suited for exposing fallacious reasoning and bad arguments. These are the reductio ad absurdum (reduce to absurdity) and what I call the reductio ad consequentia (reduce to the consequences). What we are essentially doing with these techniques is testing the argument presented to see if there are either any contradictions (absurd conclusions) or undesirable conclusions.
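The reductio ad absurdum pattern (assume the claim under test, then derive a contradiction) is the same move that proof assistants formalize. A minimal sketch in Lean 4, refuting the self-defeating claim "every proposition is false" (the proposition names are illustrative, not from the book):

```lean
-- Reductio ad absurdum: to refute "every proposition is false,"
-- assume it and derive a contradiction from any proposition P
-- we already accept as true.
example (P : Prop) (h : P) : ¬ (∀ Q : Prop, ¬ Q) := by
  intro hall      -- assume, for the sake of reductio, the claim holds
  exact hall P h  -- it would make P false, contradicting h : P
```

The structure mirrors the informal technique exactly: the `intro` step is "suppose the claim were true," and reaching `False` is the absurd conclusion that forces us to reject the supposition.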
We don’t believe in science, we trust in the scientific process. Conclusions drawn using the scientific method may change—that is the nature and the strength of the scientific process: to bring us closer to the truth as more and/or better information becomes available. Reification of science allows people to demonize science—like it is SATAN incarnate—since it is much more difficult to demonize a process.
How can we stop an infinite regression created by constantly asking “why” or “how do you know that”? The regression you mention can be very useful in questioning one’s assumptions, but it can also be fallacious. Each “how do you know” question is essentially questioning the truthfulness of the previous statement. At some point, the burden of proof shifts to the person asking “how do you know” to demonstrate that what you have claimed is false. For example: Person A: People who regularly eat donuts for breakfast are almost all obese. Person B: How do you know? (reasonable - burden is on person A to ...more
Can it be absolutely true that there are no absolute truths? The short answer is no. The longer answer: This borders on a “Yogi Berra-ism” (e.g., “Always go to other people’s funerals, otherwise they won’t come to yours.”) but is known in logic as a self-defeating statement. These are good ways to make people laugh, but not good uses of reason. It is fallacious to make a self-defeating statement (again, unless as a statement of irony). Here is a non-fallacious dialog related to absolute truth: Sally: “There is no absolute truth.” Bob: “So is THAT absolutely true??” (asked rhetorically) Sally: “I ...more
Is including the line “just food for thought” a legitimate way of making a bad argument? For example, “Just food for thought, if foods were meant to be genetically modified, they’d appear that way in nature.” This is similar to the “just playing devil’s advocate here” line. An important part of critical thinking is to consider as many possibilities as possible, so “food for thought” should be welcomed. However, it could be used as a way of just simply making a bad argument and prefacing it with “just food for thought” to get out of having to defend it. This is like saying, “No offense, but you ...more