Kindle Notes & Highlights
Read between March 5 and April 8, 2021
There are organized movements with itemized lists of rationalizations and misinformation that conveniently dismiss the evidence for evolution, global warming, vaccine effectiveness, the gender pay gap, the safety of genetically modified organisms, even the sphericity of the Earth.
There are also numerous ways to interpret the implications of facts, even if the facts themselves are not in dispute.
If someone is rich, then you can conclude (if you like them) that they are successful and savvy, or (if you don’t like them) that they are corrupt and greedy. Someone can be either courageous or foolish, steadfast or stubborn, a strong leader or authoritarian. It all depends on your perspective.
Much of the psychological research on motivated reasoning involves political opinions. What multiple studies have shown is that people pursue two distinct goals when forming political opinions: a directional goal and an accuracy goal. The directional goal leads them to the opinion that is consistent with their partisan identity. The accuracy goal leads them to try to make their opinion as correct as possible.
What researchers have found is that the more partisan the individual and the issue they are evaluating, the more directional their reasoning will be. They will weigh evidence more heavily if it supports their partisan identity. They will dismiss evidence that is against the party line. Individuals are easily primed (affected subconsciously by a stimulus) as well.
People will try to adjust their beliefs to simultaneously maximize the alignment of those beliefs with facts and maximize their emotional comfort. It’s almost like a mathematical equation. The greater the emotional discomfort (cognitive dissonance) a fact presents, the harder we will work to rationalize it away.
Neuroscientists have started looking at what’s happening in the brain when subjects are confronted with facts that present a problem for their political beliefs, and when they’re offered justifications that help to resolve the problem. Drew Westen et al. used functional magnetic resonance imaging (fMRI), which images brain activity, while challenging subjects with neutral information and information problematic for their political affiliation. They found that subjects used different parts of their brains in these two situations.
When confronted with ideologically neutral information, subjects used the rational cognitive part of their brain. When confronted with partisan information, a completely different part lit up, one known to be associated with identity, sympathy, and emotion. Also interesting, after subjects arrived at their motivated conclusion, relieving the negative emotions of the conflict, the part of their brain involved in reward became activated, giving them a nice shot of dopamine.
These studies looking at the psychology and the neural correlates of motivated reasoning provide an essential insight into how the human brain behaves, and they reinforce how challenging it can be to be a consistent critical thinker. This suggests we should make a specific effort to be more detached when it comes to ideological beliefs.
Factual beliefs about the world shouldn’t be a source of identity, because those facts may be wrong, partly wrong, or incomplete.
There is also a need to remind ourselves that people who disagree with us are just people. They are not demons. They have their reasons for believing what they do. They think they’re right just as much as we think we are right. They don’t disagree with us because we’re virtuous and they are evil. They just have a different narrative than we do, one reinforced by a different set of facts and subjective judgments. This doesn’t mean that all views are equally valid. It does suggest we should strive to focus on logic and evidence, not self-serving assumptions of moral superiority.
An argument should try to find common ground and then proceed carefully from that common ground to resolve any differences.
The first thing we should understand about a logical argument is that it follows a certain format. There are one or more premises, which are underlying facts that the argument takes for granted or is built upon. There is then some logical connection showing how these premises necessarily lead to a specific conclusion.
If the premises of an argument are true and sufficiently complete, and the logic is valid (in which case the argument is said to be “sound”), then the conclusion must be true.
If an argument is sound, the conclusion is true. However, the converse is not true. An unsound argument can still have a conclusion that happens to be true, even though the argument doesn't support it. Someone might argue that the sun is a sphere because spheres are pretty; that isn't a sound argument, but the conclusion is still true.
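The distinction between a valid form and a true conclusion can be made concrete with a brute-force truth-table check. This is a minimal sketch; the function name `is_valid` and the two example argument forms are illustrative choices, not anything from the book:

```python
from itertools import product

def is_valid(premises, conclusion, n_vars):
    """An argument form is valid iff no assignment of truth values
    makes every premise true while the conclusion is false."""
    for values in product([True, False], repeat=n_vars):
        if all(p(*values) for p in premises) and not conclusion(*values):
            return False  # found a counterexample
    return True

# Modus ponens: "If P then Q; P; therefore Q" -- a valid form.
modus_ponens = is_valid(
    premises=[lambda p, q: (not p) or q,  # "if P then Q"
              lambda p, q: p],            # "P"
    conclusion=lambda p, q: q,
    n_vars=2,
)
print(modus_ponens)  # True

# Affirming the consequent: "If P then Q; Q; therefore P" -- invalid.
affirming = is_valid(
    premises=[lambda p, q: (not p) or q,
              lambda p, q: q],
    conclusion=lambda p, q: p,
    n_vars=2,
)
print(affirming)  # False
```

Note that the check only inspects the form: whether the premises are actually true, and hence whether the argument is sound, is a separate factual question.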
If two people have come to different conclusions about a factual claim, then one or both must be wrong. Both cannot be correct. One or both must therefore have made an error in the arguments they used to come to their conclusions. The two parties should work together to examine their arguments and resolve any errors.
This only works if the arguments are about factual claims, not subjective feelings or value judgments. There is no objective way to resolve a difference of subjective opinion or values.
It is very helpful, however, to identify when a conclusion contains an aesthetic opinion or a moral choice. Doing so avoids arguing endlessly over an issue that is inherently irresolvable.
All too often I see people use knowledge of argument and logic to deconstruct the arguments of others. This can easily turn into an attempt to find some flaw in the arguments of those they perceive as being against them, and then declare victory.
Logic and arguments should be used as a tool, not a weapon. When logic is used as a weapon, it's far too easy to twist it subtly to suit one's ends. It is, of course, also good to deconstruct the arguments of others, but it is important to strive to be as fair as possible. This is called the principle of charity: give the other side the benefit of the doubt, take the best possible interpretation of their position, and deconstruct that.
If a disagreement is based upon a hidden premise, then the disagreement will be irresolvable. So, when coming to an impasse in resolving differences, it's a good idea to go back and see if there are any implied premises that haven't been addressed.
The human brain is a marvelous machine with capabilities that, in some ways, still outperform the most powerful of supercomputers. Our brains, however, do not appear to have evolved specifically for precise logic. There are many common logical pitfalls that our minds tend to fall into unless we are consciously aware of these pitfalls and make efforts to avoid them.
In order to avoid using logical fallacies to construct unsound arguments, we need to understand how to identify fallacious logic.
If I take the premises that A = B and B = C, and then conclude that A does not equal C, that is a formal logical fallacy.
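That transitivity example can be checked mechanically. The sketch below (purely illustrative) brute-forces small numbers and confirms that every case satisfying the premises violates the stated conclusion, which is exactly what makes the form a formal fallacy:

```python
from itertools import product

# Check the form "A = B, B = C, therefore A != C" by brute force
# over all triples of small integers.
counterexamples = []
for a, b, c in product(range(5), repeat=3):
    if a == b and b == c:        # both premises true
        if not (a != c):         # ...but the conclusion "A != C" false
            counterexamples.append((a, b, c))

# Every assignment satisfying the premises refutes the conclusion,
# so the argument form is invalid.
print(len(counterexamples))  # 5 -- the triples (0,0,0) through (4,4,4)
```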
Non Sequitur From Latin, this term translates to “it doesn’t follow,” and it refers to an argument in which the conclusion does not necessarily follow from the premises. In other words, a logical connection is implied where none exists. This is the most basic type of logical fallacy, and in fact all logical fallacies are non sequiturs.
Argument from Authority The basic structure of such arguments is as follows: Professor X believes A, Professor X speaks from authority, therefore A is true. Often this argument is implied by emphasizing the many years of experience of the individual making a specific claim, or the number of formal degrees they hold. The inverse of this argument is sometimes used too: that someone does not possess authority and therefore their claims must be false.
In practice this can be a complex logical fallacy to deal with. It is legitimate to consider the training and experience of an individual when examining their assessment of a particular claim. Also, a consensus of scientific opinion does carry some legitimate authority. But it is still possible for highly educated individuals and a broad consensus to be wrong—speaking from a position of authority does not make a claim necessarily true.
At the same time, saying that we should take very seriously the consensus of scientists who believe that life is the result of organic evolution would not be fallacious. That is a solid consensus (more than 98 percent of scientists) built upon a mountain of evidence, examined and argued for more than a century.
Naive evolutionary thinking can also be teleological—for example, the argument that birds evolved feathers so that they could fly, when in fact, feathers likely had to evolve prior to flight, and bird ancestors did not know they would ultimately use feathers for flying. Evolution cannot look forward. Protofeathers must have evolved for a purpose they served at the time, like insulation.
Post Hoc Ergo Propter Hoc This is perhaps the most common of logical fallacies. It follows the basic format of A preceded B, therefore A caused B, assuming cause and effect for two events just because they are temporally related (the Latin translates to “after this, therefore because of this”). This logical fallacy is frequently invoked when defending various forms of alternative medicine—I was sick, I took treatment A, I got better, therefore treatment A made me better. This is a logical fallacy because it is possible to have recovered from an illness without any treatment.
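A quick simulation makes the point. Assuming, hypothetically, an illness with an 80 percent spontaneous recovery rate and a treatment with no effect whatsoever, the sequence "took the treatment, then got better" still occurs most of the time (all numbers here are illustrative):

```python
import random

random.seed(0)

# Hypothetical illness: 80% of people recover on their own,
# and the treatment has no causal effect at all.
SPONTANEOUS_RECOVERY = 0.8
N = 10_000

took_treatment = 0
recovered_with_treatment = 0
for _ in range(N):
    treated = random.random() < 0.5                      # half try the remedy
    recovered = random.random() < SPONTANEOUS_RECOVERY   # independent of treatment
    if treated:
        took_treatment += 1
        if recovered:
            recovered_with_treatment += 1

# Post hoc reasoning: "I took it, then I got better, so it worked."
# Yet recovery among the treated simply matches the spontaneous rate.
print(round(recovered_with_treatment / took_treatment, 2))  # ~0.8
```

The treated group recovers at the background rate, so "it worked for me" stories are exactly what we would expect even from a useless treatment.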
Confusing Correlation with Causation This is similar to the post hoc fallacy in that it assumes cause and effect for two variables simply because they occur together. This fallacy is often used to give a statistical correlation a causal interpretation. For example, during the 1990s both religious attendance and illegal drug use were on the rise. It would be a fallacy to conclude that therefore religious attendance causes illegal drug use. It is also possible that drug use leads to an increase in religious attendance, or that both drug use and religious attendance are increased by a third variable.
This fallacy—confusing causation and correlation—has a tendency to be abused, or applied inappropriately, to deny all statistical evidence.
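The third-variable scenario can be simulated directly. In this illustrative sketch, a hidden confounder drives both measured quantities; the two never influence each other, yet they correlate strongly:

```python
import random

random.seed(1)

# Hypothetical confounder: a hidden variable Z independently pushes up
# both X and Y, which have no causal link to each other.
n = 5_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]   # X driven by Z plus noise
y = [zi + random.gauss(0, 0.5) for zi in z]   # Y driven by Z plus noise

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = sum((ai - ma) ** 2 for ai in a) ** 0.5
    sb = sum((bi - mb) ** 2 for bi in b) ** 0.5
    return cov / (sa * sb)

# X and Y correlate strongly despite having no causal connection.
print(round(pearson(x, y), 2))  # roughly 0.8
```

The correlation is real and reproducible; what the data alone cannot tell us is which of the possible causal arrangements produced it.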
Carl Sagan gave perhaps the most famous example of this fallacy in his “invisible, floating, incorporeal, heatless dragon in his garage” argument. Essentially, he claims that there is a dragon in his garage, and then invents a special reason why each test for the presence of the dragon fails.
The Latin phrase tu quoque translates to “you too.” This is an attempt to justify wrong action because someone else does the same thing: “My evidence may be bad, but so is yours.” This fallacy is frequently committed by proponents of various alternative medicine modalities, who argue that even though their therapies may lack evidence of effectiveness, more mainstream modalities also lack such evidence.
That argument, of course, doesn’t justify a treatment that lacks evidence. It is, furthermore, a false premise, as the level of evidence for mainstream therapies is often much higher than for those considered “alternative.”
But being open-minded also means being open to the possibility that a claim is wrong. It doesn’t mean assuming every claim is true or refusing to ever conclude that something is simply false. If the evidence leads to the conclusion that a claim is false or a phenomenon does not exist, then a truly open-minded person accepts that conclusion in proportion to the evidence. Open-mindedness works both ways.
Absence of evidence is, in fact, evidence of absence. It’s just not absolute proof of absence.
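The evidence-versus-proof distinction has a simple Bayesian reading. With illustrative numbers (the prior and detection probabilities below are assumptions for the sketch, not figures from the book), a failed search lowers the probability of a claim without driving it to zero:

```python
# If a phenomenon, when real, would probably leave detectable evidence,
# then failing to find evidence lowers its probability -- it does not
# prove absence. All numbers here are illustrative.
prior = 0.5             # P(phenomenon is real) before looking
p_detect_if_real = 0.9  # a real phenomenon usually leaves evidence
p_detect_if_not = 0.0   # assume no false positives, for simplicity

# Bayes' theorem, conditioning on "no evidence found":
p_no_ev = ((1 - p_detect_if_real) * prior
           + (1 - p_detect_if_not) * (1 - prior))
posterior = (1 - p_detect_if_real) * prior / p_no_ev

print(round(posterior, 3))  # 0.091 -- much lower than 0.5, but not zero
```

The harder the phenomenon should be to miss, the more a failed search counts against it; a weak search moves the posterior only slightly.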
A false continuum is the idea that because there is no definitive demarcation line between two extremes, the distinction between the extremes is therefore not real or meaningful. For example, there is a fuzzy line between cults and religion, therefore they are really the same thing. This is like saying that no one is short or tall, that the very concepts of shortness and tallness are invalid because there is no sharp dividing line between the two. However, extremes can exist and be meaningful and clearly recognized, even when there is a fuzzy border in the middle.
This logical fallacy is often combined with a tu quoque logical fallacy. For instance, someone engaged in rank pseudoscience might argue that mainstream scientists sometimes break the rules too, by using small sample sizes, subjective outcomes, or similar bad methods. Therefore, there is no real difference between science and pseudoscience. This type of argument is closely related to another logical fallacy, the false equivalency. Extreme fraud doesn’t become okay because everyone cheats a little.
False Analogy A false or faulty analogy consists of assuming that because two or more things are similar in one way, then they are also similar in some other way, ignoring any important distinctions between the two. One might say that the protein complex that moves the flagellum of a bacterium is like a motor. Creationists then argue that this is evidence against evolution, because motors are designed and therefore the flagellum must also have been designed. They are taking a metaphor that was intended to convey one meaning and extending it to another, turning a legitimate analogy into a false one.
Genetic Fallacy The term "genetic" here doesn't refer to DNA or genes but to origins. This fallacy consists of arguing against something because of where it came from, rather than considering whether or not it is valid in its current form. For example, one might argue that the company Volkswagen is not a good company because it was created by Hitler. However, the company was created in 1937, and its origins eighty years ago likely have no influence on how it behaves today. Similarly, the science of astronomy evolved out of astrology, but no one would seriously argue that astronomy is not a legitimate science because of where it came from.
When a 2009 study looking at the effectiveness of acupuncture for chronic back pain showed that needle location and even insertion do not affect the outcome (there was no difference between “real” acupuncture and placebo acupuncture), the authors claimed that “acupuncture” worked, it was just unclear by what mechanism.
The naturalistic fallacy usually involves moral judgments. For instance, one might refer to the behavior of animals to justify a particular human behavior as “natural.” Animals kill each other, therefore it is moral for humans to kill each other. Moral judgments, though, are distinct from what happens to exist in nature.