The Skeptics' Guide to the Universe: How To Know What's Really Real in a World Increasingly Full of Fake
Kindle Notes & Highlights
3%
Science is still a messy and flawed process, but it is a process. It has, at least, the capacity for self-correction, to move our beliefs incrementally in the direction of reality.
6%
This research follows other research demonstrating that imagining an event is often enough to create the false memory of that event.
8%
We’re a bit freaked out by really good optical illusions because they force us to directly confront a reality we tend to ignore as we go through our daily lives: What we think we see is not objective; it is a process of our brains, and that process can be fooled.
8%
By the time we see that baseball flying at our face, it might have hit us, so our brains project movement a little bit into the future to compensate for the delay in processing time.
9%
Pareidolia refers to the process of perceiving an image in random noise, such as seeing a face in the craters and maria of the moon.
10%
The technical term for the more general phenomenon of seeing patterns where they do not exist is apophenia, the tendency to see illusory patterns in noisy data.
13%
self-estimates do decrease with decreasing knowledge, but the gap between performance and self-assessment increases as your performance decreases.
13%
As we try to make sense of the world, we work with our existing knowledge and paradigms. We formulate ideas and then systematically seek out information that confirms those ideas. We dismiss contrary information as exceptions. We interpret ambiguous experiences as in line with our theories. We make subjective judgments that further reinforce our beliefs. We remember these apparent confirmations, and then our memories are tweaked over time to make the appearance of confirmation even more dramatic.
13%
Think about some area in which you have a great deal of knowledge, at the expert to mastery level (or maybe a special interest in which your knowledge is above average). Now, think about how much the average person knows about your area of specialty. Not only do they know comparatively little, they likely have no idea how little they know and how much specialized knowledge even exists.
13%
Here comes the critical part: Now realize that you are as ignorant as the average person is in every other area of knowledge in which you are not expert.
14%
We have narratives and beliefs that serve our basic psychological needs, such as the need for a sense of control. When those beliefs are challenged, we don’t take a rational and detached approach. We dig in our heels and engage in what is called motivated reasoning. We defend the core beliefs at all costs, shredding logic, discarding inconvenient facts, making up facts as necessary, cherry-picking only the facts we like, engaging in magical thinking, and using subjective judgments as necessary without any consideration for internal consistency.
14%
When the belief is strongly and emotionally held, however, it becomes too difficult to change. If the belief is at the core of our worldview, then changing it might cause a chain reaction, magnifying cognitive dissonance. It’s emotionally easier to simply dismiss the new information, challenge its source, rationalize its implications, even invent a conspiracy to explain it.
14%
No scientific study is perfect, so you can always point to limitations if you want to deny the conclusions. No source is impeccable, and people make mistakes, so perhaps this is one. Different sources say different things, so you can choose to believe the one that reduces your cognitive dissonance.
16%
Often people will choose the assumptions that best fit the conclusion they prefer. In fact, psychological experiments show that most people start with conclusions they desire, then reverse engineer arguments to support them—a process called rationalization.
17%
Casually dismissing a solid scientific consensus as an “argument from authority” is a misuse of the logical fallacy. It is also an excellent example of how important context is in evaluating informal logical fallacies.
23%
Confirmation bias is a tendency to notice, accept, and remember information that appears to support an existing belief and to ignore, distort, explain away, or forget information that seems to contradict an existing belief.
23%
The narratives we use to understand the world don’t just organize information, they curate and filter it to serve those very narratives.
24%
There is so much information out there in the world. We encounter numerous events, people, and bits of data every day. Our brains are great at sifting this data for meaningful patterns, and when we see one, we think, “What are the odds? That cannot be a coincidence, so it confirms my belief.” Rather, the odds are very good. The possibility that you would encounter something confirming your belief was almost certain, given the number of opportunities.
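Not from the book — a quick sketch of the arithmetic behind this highlight. The numbers (200 noticed events per day, a 1 percent chance each of looking like a "meaningful" confirmation) are invented for illustration, but they show why an apparent confirmation is almost guaranteed:

```python
# Hypothetical numbers, chosen only to illustrate the point:
p = 0.01   # per-event chance that a random event looks like a confirmation
n = 200    # events, people, and bits of data noticed in a day

# Probability that at least one event per day appears to confirm the belief:
p_at_least_one = 1 - (1 - p) ** n
print(f"P(at least one apparent confirmation per day) = {p_at_least_one:.3f}")
```

With these made-up inputs the daily probability is already around 0.87, so a "confirmation" nearly every day is the expected outcome of chance alone, not evidence.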
25%
The power of confirmation bias is that it works tirelessly in the background, filtering vast amounts of data, until you have a compelling illusion that the evidence supports your belief.
25%
Instinctively, however, people look for evidence to confirm their hypothesis rather than evidence that can disprove it.
30%
First we have hyperactive pattern recognition—a tendency to see any possible patterns, erring on the side of false positives. This would serve the purpose of minimizing the chance of missing real patterns that may be important.
34%
If we live in a universe where every effect has a natural cause, then science should work well. We will encounter anomalies but they will eventually be explained. Science will progress. If we lived in a universe with supernatural phenomena, science would encounter persistent anomalies with which it could make no real progress.
36%
Science is about minimizing bias, and good experiments blind subjects and experimenters to avoid bias.
42%
While bias and corruption certainly exist, that doesn’t mean it is reasonable to assume that any science with which you disagree can be casually dismissed as entirely the result of such corruption. But that’s exactly what deniers do.
45%
The grand conspiracy forms a triangle of sorts. First there are the conspirators. This is typically a large, powerful, and shadowy organization with vast resources and control. They need to be powerful in order to fake moon landings, poison the public through jet exhaust, or frame terrorists for 9/11. Then there are the conspiracy theorists, an “Army of Light” that is able to see through the conspiracy (because they are just so clever). Finally, there is everyone else, the dupes or “sheeple” who believe the standard explanation of history and current events.
45%
Any evidence that can potentially falsify the conspiracy theory is just part of the conspiracy. To the conspiracy theorist, such evidence was obviously fabricated in order to maintain the conspiracy.
46%
While the conspirators are necessarily preternaturally large and powerful to pull off such elaborate conspiracies, they are simultaneously incredibly stupid in the ways they allegedly “expose” themselves to the conspiracy theorists. The conspirators are brilliant when the believers need them to be and careless when they need them to be.
46%
Once the conspiracy narrative is adopted, it becomes a lens through which reality is viewed. Pattern recognition and hyperactive agency detection combine to form a tendency to see disparate events as connected, with an unseen agent behind them.
50%
It has been clearly demonstrated that subjects who are being studied in a clinical trial objectively do get better. This is precisely because they are in a clinical trial—they’re paying closer attention to their overall health, likely taking better care of themselves due to the constant reminders concerning their health and habits provided by the study visits and the attention they are getting, being examined on a regular basis by a physician, and their overall compliance with treatment is higher than usual.
50%
In any varying system, which could be athletic performance or the waxing and waning symptoms of a chronic illness, any extreme variance is statistically likely to be followed by a more typical variance. This means that when you have bad symptoms, it’s likely that your symptoms will eventually become milder, or regress to the mean. This also means that any treatment you take when your symptoms are severe is likely to be followed by a lessening of those symptoms, creating the illusion that the treatment worked.
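Regression to the mean is easy to demonstrate with a toy simulation (this is my illustration, not the book's). Symptom severity is modeled as a stable mean plus independent day-to-day noise; if you select the unusually bad days, the very next measurement is, on average, back near the mean with no treatment at all:

```python
import random

random.seed(0)
mean_severity = 5.0
# Daily symptom severity: a fixed underlying mean plus random noise.
days = [mean_severity + random.gauss(0, 2) for _ in range(10_000)]

# The days you'd be most tempted to start a treatment: severity > 8.
bad_days = [i for i in range(len(days) - 1) if days[i] > 8]
avg_bad = sum(days[i] for i in bad_days) / len(bad_days)
avg_next = sum(days[i + 1] for i in bad_days) / len(bad_days)

print(f"average severity on selected bad days: {avg_bad:.2f}")
print(f"average severity on the following day: {avg_next:.2f}")
```

Because the noise is independent from day to day, the follow-up average sits near 5 while the selected days average well above 8 — the "improvement" is pure selection, which is exactly the illusion a treatment taken on a bad day exploits.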
51%
If I take a treatment and then feel better, that seems like powerful evidence that the treatment worked. You cannot draw this conclusion, however. Perhaps I would have felt better anyway. Maybe I even would have felt better faster without the treatment. The only way to know is to look at many people taking the treatment versus a fake treatment (placebo) under controlled and blinded conditions and then compare all the outcomes.
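A minimal sketch (mine, not the author's) of why the group comparison matters. Here an entirely inert treatment is simulated alongside a placebo arm; both arms share the same natural recovery, so every individual "gets better," yet the comparison shows no treatment effect:

```python
import random

random.seed(1)

def improvement(natural_recovery=2.0, treatment_effect=0.0):
    # Observed improvement = natural course + real effect (if any) + noise.
    # These parameter names and values are invented for illustration.
    return natural_recovery + treatment_effect + random.gauss(0, 1)

treated = [improvement(treatment_effect=0.0) for _ in range(1_000)]  # inert
placebo = [improvement() for _ in range(1_000)]

mean_treated = sum(treated) / len(treated)
mean_placebo = sum(placebo) / len(placebo)
print(f"mean improvement, treatment arm: {mean_treated:.2f}")
print(f"mean improvement, placebo arm:   {mean_placebo:.2f}")
```

Any single treated patient could honestly report "I took it and felt better," but the arms improve by the same amount — only the controlled comparison reveals the treatment did nothing.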
53%
We tend to understand the world through distilled narratives, simple stories that approximate reality (whether we know it or not or whether we intend to or not). Those distilled narratives can be very useful as long as you understand they are simplified approximations and don’t confuse them with a full and complete description of reality.
54%
A consulting firm will come in, observe the work culture of a company, then make recommendations based upon some theory or their alleged expertise. Productivity almost always improves after such interventions. It’s possible that this corporate self-help industry is essentially selling the Hawthorne effect, even if their specific recommendations are speculative, nonsensical, or counterproductive.
54%
The bottom line of all this is that any intervention in almost any context will subjectively seem to work.
54%
For researchers, the challenge is to separate real specific effects from nonspecific artifacts of doing research or making an intervention. At this point reality is more complex than even the experts have a handle on.
59%
Homunculus theory is mainly a desire for simplicity, to subsume all the complexities of biology in one nice neat little system. It would be wonderful if we had a map of the whole body, somewhere we could just read in order to make detailed diagnoses, or massage or poke with needles to treat illness.
61%
if your theory is compatible with every possible observation, then by definition it is not falsifiable, and it is therefore not a scientific theory.
65%
Humans in general are great at coming up with reasons to maintain their desired beliefs in the face of contradictory data. More intelligent and educated people aren’t necessarily better at critical thinking, but they are likely to be more clever and creative in coming up with such excuses—and scientists are no exception.
65%
In fact, research shows there are many benefits of a negative or pessimistic outlook. Pessimism correlates with higher earnings, fewer marital problems, more effective communication, greater generosity, and less disappointment.
89%
Realize that you will never achieve the goal of ridding yourself of bias and error. All you can do is remain vigilant and work hard to keep them to a minimum. And while you’re keeping that light focused inward, recognize that you have sacred cows, ideas that are part of your identity and will cause you emotional pain to change.
89%
Being wrong will still hurt the ego. You need to get to a place where refusing to correct an error will hurt more.