Thinking 101: How to Reason Better to Live Better
Read between January 3 and January 13, 2024
3%
Illusion of Skill Acquisition
4%
People often fall for the illusion that they can perform a difficult feat after seeing someone else accomplish it effortlessly.
4%
Short presentations are actually harder to prepare for than long ones,
5%
Perhaps the illusion of knowledge explains why some conspiracy theories are so persistent.
6%
Having access to unrelated information was enough to inflate their intellectual confidence.
6%
we can be susceptible to cognitive biases even after we learn about them because most (or perhaps all) of them are by-products of highly adaptive mechanisms that have evolved over thousands of years to aid in our survival as a species. We can’t just turn them off.
7%
when we see two identical horizontal lines placed in linear perspective, our visual system assumes that the one closer to the vanishing point must be larger. In fact, line A and line B are exactly the same lengths, but our visual systems “think” that A must be longer than it is. This is called the Ponzo illusion after Mario Ponzo, the Italian psychologist who first demonstrated it.
8%
Trying out skills may seem like an obvious solution but surprisingly enough, not many of us actually do it. Some people think they’re trying out skills when they’re simply running the process in their heads and not using their physical muscles.
8%
Everything flows smoothly in your mental simulation, feeding your overconfidence.
8%
Trying to explain what they thought they knew was enough to make them realize how much less they knew than they’d assumed.
8%
Talking through possible answers to practice questions is vital because you can objectify your responses.
9%
That is why it is so important for society that we have conversations with people who hold different views than we do. We tend to be drawn to people who share our views. When we stay in our bubbles, we do not talk about the impacts of the policies we support, because we assume that our allies already know them. It’s only when we are forced to explain the consequences of the positions we hold to someone who does not share our views that we can begin to recognize the holes in our knowledge and the flaws in our reasoning, and work to fix them.
11%
My own solution, not based on any scientific evidence but from having personally experienced plenty of planning fallacies, is simple: I always add 50 percent more time to my initial estimate, like when I tell a collaborator that I can look over the manuscript in three days, even though I actually think I can do it in two. This strategy works fairly well for me.
15%
Peter C. Wason was a cognitive psychologist at University College London. He devised the famous 2–4–6 task in 1960, providing the first experimental demonstration of what he called confirmation bias, our tendency to confirm what we already believe.
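To see why testing only confirming cases fails, here is a minimal Python sketch of the 2–4–6 setup. The hidden rule ("any increasing triple") follows the standard description of the task; the specific test triples are illustrative assumptions, not taken from Wason's paper.

```python
# A rough simulation of Wason's 2-4-6 task. The experimenter's hidden rule is
# "any strictly increasing triple," but a participant who hypothesizes
# "each number goes up by 2" and only tests triples that fit that hypothesis
# never receives a "no," so the overly narrow hypothesis is never falsified.

def hidden_rule(triple):
    """The experimenter's actual rule: the numbers strictly increase."""
    a, b, c = triple
    return a < b < c

# Confirmation-seeking tests: all consistent with "goes up by 2" (illustrative picks).
confirming = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# Falsification-seeking tests: triples that contradict "goes up by 2."
disconfirming = [(1, 2, 3), (2, 4, 100), (6, 4, 2)]

for label, triples in [("confirming", confirming), ("disconfirming", disconfirming)]:
    for t in triples:
        print(label, t, "->", hidden_rule(t))
# The confirming tests all return True, so they teach us nothing new;
# (1, 2, 3) and (2, 4, 100) also return True, revealing that the rule is broader.
```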
16%
Thanks to the Americans with Disabilities Act, elevator doors are required to remain open long enough to allow anyone using crutches or a wheelchair to get in. According to Karen Penafiel, the executive director of the trade association National Elevator Industry, Inc., elevators’ door-close buttons do not work until that waiting time is over.
18%
Some people may believe genes determine our lives. Genes certainly don’t do that, because they always interact with the environment.
20%
What is needed to rationally test this notion is to try to falsify this hypothesis by giving women fair chances. In terms of the reasoning fallacy committed, giving only men the opportunities and concluding that men are better is no different than a child believing that monster spray works because they sprayed it in every room and haven’t seen any monsters since. We need to outgrow this fallacy.
21%
The number of possible chess games, even with its limited number of pieces and well-defined rules, is estimated to be 10^123, which is greater than the number of atoms in the observable world. Imagine how many possible versions of our future lie ahead of us. Thus, we need to stop our searches when they are satisfying enough. Simon called this “satisficing,” a word he created by combining “satisfying” and “sacrificing.”
22%
Confirmation bias might be a side effect of meeting our need to satisfice, stopping our search when it’s good enough in a world that has boundless choices. Doing that can make us happier and it can also be more adaptive. Nonetheless, the problem with confirmation bias is that we continue to use it even when it is maladaptive and gives us wrong answers, as we have seen through the many examples in this chapter.
22%
But because confirmation bias is so entrenched, we can exploit it to overcome it. This is not as paradoxical as it sounds. The key here is to consider not just one but two mutually exclusive hypotheses and try to confirm both.
23%
To avoid this sort of confirmation bias, we should query ourselves to generate evidence for both possibilities.
24%
as it turned out, the original Mozart effect was not long-lasting, and was limited only to spatial reasoning rather than to the entire IQ; some researchers could not even replicate the original finding.
25%
Life is indeed full of possibilities, definitely more than the number of atoms in the observable as well as the unobservable world, and it’s up to you to discover them.
26%
Our causal conclusions depend on which cues we rely on more heavily.
26%
Similarity: We tend to treat causes and effects as similar to each other.
26%
Sufficiency and Necessity: We often think causes are sufficient and also necessary for an effect to occur.
26%
Recency: When there is a sequence of causal events, we tend to assign more blame or credit to a more recent event.
26%
Controllability: We are inclined to blame things that we can control rather than things that we cannot control.
27%
But relying on similarity to make causal inferences can lead us astray, because causes and effects are not always similar to each other.
27%
Though quiet typically signals that there are no problems, a toddler’s long silence can mean trouble
27%
The point of these examples is to remind you of the similarity heuristic’s limitations. Sometimes small causes do produce large effects.
28%
We engage in this sort of discounting all the time. It’s as if we believe that two causes are mutually exclusive, such that when one is present, the other is highly unlikely or couldn’t have played a role.
30%
We engage in similar counterfactual reasoning outside the courtroom, when we try to figure out what caused an outcome. Would B have occurred even if A hadn’t happened? Would I have missed being involved in that accident if I hadn’t gone to that store? Would they have stayed married if he hadn’t taken that job? If the outcome would have been different in our counterfactual world, we treat that factor as a cause. There is nothing irrational about using counterfactual reasoning to make causal judgments; after all, it’s what’s used in the legal system. Still, not all necessary conditions are …
30%
people’s causal attributions for the same event so often diverge; deciding what counts as normal or abnormal can vary depending on one’s perspective.
30%
consider gun violence. In the United States, people can legally purchase pistols, shotguns, rifles, and even semiautomatic weapons in some states. Whenever mass shootings occur, some people blame the shooters, reasoning that most gun owners don’t go out and shoot people, so there must be something abnormal about those shooters, such as their mental health, anger management ability, ideology, etc. But from a global perspective, it is clearly the United States that is abnormal. According to the Small Arms Survey, the number of civilian firearms per 100 persons in the United States was 120.5 in …
31%
We may place more blame on actions than inactions because when we are thinking about alternative possibilities, it’s easier to think about one specific action we wish we hadn’t done than to imagine all the things we might have done in cases where we did nothing.
31%
Inaction is not always better than a bad action; sometimes it’s equally bad.
32%
When we give too much credit to the most recent event, even in situations in which the order of events should not matter, we are not only ignoring the other factors that are responsible for the outcome, but depriving them of their fair share of credit or blame.
32%
Because we make causal attributions in order to guide our future actions, we typically don’t blame things that we can’t control.
34%
The perception of causality is an illusion.
34%
the kinds of why questions that are worth trying to answer are those that potentially allow us to gain insights that can guide our future actions.
34%
Once you stop obsessing about why certain things happened, especially things you wish hadn’t, then you can take a more distant view, which might help free you from negative emotions like remorse and regret, and also perhaps allow you to engage in more constructive problem-solving the next time you encounter a tricky situation.
36%
While vivid examples are a great way to communicate and convince, this chapter is about their perils. Specific examples and anecdotes can oftentimes be too powerful, leading us to violate important rational principles.
36%
For many people, one or two anecdotes from people they know are more persuasive than scientific evidence based on much larger samples.
36%
Some researchers have argued that it is because our minds are built to think in terms of what we experience and perceive rather than abstract concepts. That is, our thinking is based primarily on what we can see, touch, smell, taste, or hear.
36%
There are at least three key concepts that all of us need to better understand if we are to avoid making blatantly irrational judgments in everyday life. They are: the law of large numbers; regression toward the mean; and Bayes’ theorem.
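The first two of these are easy to see in a quick simulation. The sketch below uses only the Python standard library and made-up parameters (a fair coin, and test scores that are part skill, part noise); it illustrates the concepts rather than anything specific from the book.

```python
import random

random.seed(0)

# Law of large numbers: sample averages settle near the true probability
# as the sample grows, while small samples swing widely.
def heads_proportion(n, p=0.5):
    return sum(random.random() < p for _ in range(n)) / n

for n in (10, 100, 10_000):
    print(f"n = {n:>6}: proportion of heads = {heads_proportion(n):.3f}")

# Regression toward the mean: when a score is part skill, part luck,
# the top scorers on one test tend to land closer to the average on a retest,
# simply because their good luck does not repeat.
skill = [random.gauss(100, 10) for _ in range(10_000)]
test1 = [s + random.gauss(0, 10) for s in skill]
test2 = [s + random.gauss(0, 10) for s in skill]

top = sorted(range(len(test1)), key=test1.__getitem__)[-1000:]  # top 10% on test 1
print("top scorers, test 1 average:", round(sum(test1[i] for i in top) / len(top), 1))
print("same people, test 2 average:", round(sum(test2[i] for i in top) / len(top), 1))
```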
42%
To compute the probability of A given B, or P(A|B), from the probability of B given A, or P(B|A), we need to use Bayes’ theorem,
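The truncated sentence refers to the standard form of the theorem, P(A|B) = P(B|A) · P(A) / P(B). A minimal sketch of the mechanics, with probabilities invented purely for illustration:

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B),
# where P(B) = P(B|A) * P(A) + P(B|not A) * P(not A).

def posterior(p_b_given_a, p_a, p_b_given_not_a):
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Invented numbers: a condition with a 1% base rate, and a test that detects it
# 90% of the time but false-alarms on 5% of the people who don't have it.
print(round(posterior(p_b_given_a=0.90, p_a=0.01, p_b_given_not_a=0.05), 3))
# 0.154 -- a positive result still leaves the probability well below one in five.
```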
43%
people also confuse the conditional probabilities. That is, based on the belief that “if there is terrorism, it’s by Muslims,” they flip it and believe that “if a person is Muslim, that person is a terrorist.” This is as nonsensical as saying that “if something is a koala, it is an animal” also means “if something is an animal, it is a koala.”
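The koala example can be put into rough numbers to show how far apart the two conditional probabilities are; the counts below are invented for illustration.

```python
# Flipping a conditional probability ignores base rates.
n_animals = 10_000_000  # assumed number of animals in some population
n_koalas = 100_000      # assumed number of koalas among them (every koala is an animal)

p_animal_given_koala = n_koalas / n_koalas   # 1.0: if it's a koala, it's certainly an animal
p_koala_given_animal = n_koalas / n_animals  # 0.01: if it's an animal, it's almost never a koala

print(p_animal_given_koala, p_koala_given_animal)
```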
44%
After all, the whole point of learning is to transfer our knowledge to the new problems we will face in the future.
44%
if you are telling a story to make a point, your point will have a greater likelihood of being remembered if you embed it in multiple stories and tell all of them.