Kindle Notes & Highlights
Read between May 1 and May 9, 2021
Our memories serve more to support our beliefs than to inform them. In a way, they are an evolving story we tell ourselves.
Whenever you find yourself saying, “I clearly remember . . .” stop! No, you don’t. You have a constructed memory that is likely fused, contaminated, confabulated, personalized, and distorted. And each time you recall that memory you reconstruct it, changing it further.
Hyperactive agency detection is the tendency to interpret events as if they were the deliberate intent of a conscious agent rather than the product of natural forces or unguided chaotic events.
The Dunning-Kruger effect describes the inability to evaluate one’s own competency, leading to a general tendency to overestimate one’s abilities.
The greatest enemy of knowledge is not ignorance—it is the illusion of knowledge. —Daniel J. Boorstin
An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge.
As we try to make sense of the world, we work with our existing knowledge and paradigms. We formulate ideas and then systematically seek out information that confirms those ideas. We dismiss contrary information as exceptions. We interpret ambiguous experiences as in line with our theories. We make subjective judgments that further reinforce our beliefs. We remember these apparent confirmations, and then our memories are tweaked over time to make the appearance of confirmation even more dramatic.
In addition to the various aspects of critical thinking, self-assessment is a skill we can strive to develop specifically. But a good rule of thumb is to err on the side of humility. If you assume that you know relatively less than you think you do and that there is more knowledge than you are aware of, you will usually be correct.
The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story the mind has managed to construct. —Daniel Kahneman
Confirmation bias is a tendency to notice, accept, and remember information that appears to support an existing belief and to ignore, distort, explain away, or forget information that seems to contradict an existing belief.
This is why confirmation bias has such a strong effect. It gives us the confident illusion that we are following the evidence. In reality, our beliefs are manufacturing the evidence. In the end we may be extremely confident in a totally false belief.
But the core lesson remains—if you want to test your hypothesis, try to prove it wrong. Do not only look for evidence to prove it right.
We should be careful when interpreting the behavior of others. What might appear to be laziness, dishonesty, or stupidity might be better explained by situational factors of which we are ignorant. —Robert Todd Carroll
With an open mind and before reaching any conclusions, we can ask other people why they did or said what they did. It’s also okay to simply withhold judgment, to recognize that life is complex and we likely don’t have enough information to judge a situation.
Often the conclusion that something is an anomaly derives from a lack of familiarity or expertise. We may, for example, be unfamiliar with conditions in exotic environments, and something may seem like an anomaly simply because we lack the specialized knowledge (scientific, technical, historical) to know the true explanation.
Often the actions of others seem unfathomable to us. Our instinct is to try to explain the behavior of others as resulting from mostly internal forces, and we tend to underestimate the influence of external factors, as we discussed in the last chapter. We also tend to assume that the actions of others are deliberate and planned rather than random or accidental.
Torture the data, and it will confess to anything. —Ronald Coase
A common ploy (often called the Jeane Dixon effect) is to make dozens of predictions knowing that the more that are made, the better the odds that one will hit. When one comes true, the psychic counts on us to conveniently forget the 99 percent that were way off, making the correct predictions seem much more compelling than they really are. This is a conscious or deliberate form of subjective validation, or, put more simply, fraud.
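To see why sheer volume favors the psychic, here is a minimal sketch of the arithmetic; the per-prediction hit probability p = 0.05 and the count n = 100 are illustrative assumptions, not figures from the text:

P(\text{at least one hit}) = 1 - (1 - p)^n, \qquad 1 - (1 - 0.05)^{100} \approx 0.994

Even a guesser with only a 1-in-20 chance per prediction is all but certain to score at least one apparent hit across a hundred tries; the forgotten misses are what make that hit seem compelling.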
Methodological naturalism posits that nature is all that we can know, regardless of whether or not it’s all there is (which by definition we cannot know).
If you find a result that contradicts well-established scientific conclusions, your first thought should be that you made a mistake, not that you just overturned an entire field of science.
Pseudoscientists, often because they cannot prove their claims, frequently attempt to shift the burden of proof to those who are skeptical of those claims. They maintain that their claim must be accepted as true because it hasn’t been proven false.
Being wrong in science is useful; it still helps us move toward the answer. Being not even wrong is worthless and is by definition not scientific.
When true scientists ask a question, they want an answer and will give due consideration to any possibilities. Deniers, on the other hand, will ask the same undermining questions over and over, long after they have been definitively answered. The questions—used to cast doubt—are all they are interested in, not the process of discovery they're meant to inspire.
Science is also not directly about truth, but rather about building testable models that predict how the universe behaves.
Some researchers have labeled this phenomenon “solution aversion” in the case of global warming: Reject the science because you don’t like the proposed solutions.
Conspiracy theories are also often arguments from ignorance. Theorists point to apparent anomalies, coincidences, or things that don’t make sense to their limited understanding, and then “just ask questions.” If you can’t explain everything down to an arbitrary level of detail, there must be a conspiracy.
The sane understand that human beings are incapable of sustaining conspiracies on a grand scale, because some of our most defining qualities as a species are inattention to detail, a tendency to panic, and an inability to keep our mouths shut.
We tend to assume that big events must have big causes. It just doesn’t sit right with us to think that a major world event, with significant historical implications, was pulled off by some lone nutjob.
Beware of simplistic systems that purport to explain complex phenomena. Usually the world turns out to be more complex than we think, not simpler.
Also, if you can’t think of a possible mechanism by which something can work, then be skeptical.
An argument from ignorance—basing a conclusion on what is not known—is always a weak argument, because it doesn’t require any positive evidence for a theory: It’s just knocking down a competing theory.
It stands, however, as a good reminder that just because a belief feels right, that doesn’t make it real.
Consciousness, they argue, doesn’t have the properties of material things, therefore it is not a material thing. Therefore it is something else. Let’s call it spiritual. This may sound superficially interesting, but it is utter nonsense. They are making what we call a category mistake. They are assuming that consciousness is a thing, and they’re wrong. The brain is a thing, and it has all the properties of a thing. The mind is not a thing, it is a process. The mind is not the brain, it is what the brain does—it is the brain in action.
We can always invent clever explanations for why our hypothesis appears to fail. But we must take very seriously the straightforward possibility that our hypothesis is simply wrong.
The correspondence between reality and my beliefs comes from reality controlling my beliefs, not the other way around. —Eliezer S. Yudkowsky
It is apparently helpful to worry, at least to some extent. Excessive optimism can make us careless and set us up for failure.
It’s perfectly okay to say that you don’t have an opinion about a topic if you don’t feel you’ve done adequate research.
We shouldn’t uncritically accept the hype and spin of companies offering simple answers (that involve buying their product), but nor should we reject an entire technology based upon fear and misinformation.
Psychologists call this the “illusion of explanatory depth.” We tend to think we know how things work, even when we don’t.
It’s okay to question anything; knowledge is not absolute but is a process of discovery; there is no absolute authority, as any claim is only as good as the evidence that supports it.