The Age of Magical Overthinking: Notes on Modern Irrationality
“It’s when we want life to feel different, but we wake up and do the same thing over and over again. We’re not planning, creating, pausing and saying, ‘I’m going to take control.’ ” There’s a coziness to
Some say a cognitive bias is a “social illusion.” Our minds always fill in gaps to tell a story—the one at hand, a tale of space-age wonder.
recency illusion—the tendency to assume that something is objectively new, and thus threatening, simply because it’s new to you. Anyone
What about the human mind convinces us that a piece of novel information is worthy of panic, and then, just as perplexing, what makes us so quickly forget and move on?
“Why are we like this?”—and the longer I live, the more those seem to come up—the psychological explanation is often one of two things: Either the irrationality at hand carries a slightly outdated evolutionary benefit (a cognitive wisdom tooth, if you will), or it’s merely an inconvenient side effect of some other legitimately useful trait (scientists sometimes call this a “spandrel”; the human chin is a physical example).
modern stimuli are typically more conceptual than movement in the bushes, it’s harder to know where to point our focus, what’s genuinely worthy of our distress.
She lamented that the capitalistic pressure to “colonize the self,” to treat our bodies and minds like productivity machines, is identical to that which colonizes our time with excess news. “Those same means by which we give over our hours and days [to work] are the same with which we assault ourselves with information and misinformation, at a rate that is frankly inhumane,” wrote Odell.
Over the millennia, our sympathetic nervous systems grew expertly good at assuming the worst.
After all, there was no upside to “rationalizing” yourself out of overreacting to a stimulus. The stakes were too high. In noble pursuit of keeping us alive, our emotional brain has “first dibs” on interpreting incoming information. While the prefrontal cortex is well equipped to sort through complex datasets before arriving at a conclusion, it only has second dibs, and the amygdala prefers to get there via cognitive pole vault.
Our nervous systems struggle to sustain agitation for the many crises news platforms serve us, especially when material changes don’t result right away.
It also needs positive feedback to help us step out of survival mode,”
A 2019 study by scientists at the Technical University of Denmark suggested that over the last century, the sheer quantity of knowable information has caused the global attention span to shrink. “It seems that the allocated attention time in our collective minds has a certain size but the cultural items competing for that attention have become more densely packed,” commented Sune Lehmann, one of the study’s authors.
“Efficiency is doing things right; effectiveness is doing the right things,” Drucker
eustress,
Without memory, time doesn’t exist, and the borders bookending clockable events are the checkpoints we need to chart its passage. That’s why time felt so twisted during COVID-19 lockdown.
This is also why childhood feels so long—because everything was brand-new. Or at least brand-new to you. The recency illusion.
Awe is not unlike the Greek understanding of ecstasy, meaning “to stand to the side of reality,” or the flow states described by psychologist Mihaly Csikszentmihalyi. A person is “in flow” when their attention is so effortlessly consumed by an enjoyable challenge that “time disappears, you forget yourself, you feel part of something larger.” Csikszentmihalyi argued that putting more of everyday life in that “flow channel” is a key to well-being. The craft of time dilation and contraction is also a benefit of mindfulness.
When a single word has two definitions that counter each other, that’s called a “contronym,” and there are dozens of them in English, including the word “fine” (which can either mean really nice or just adequate), “transparent” (which can either mean invisible or obvious), or the use of “bad” to mean “good” (as in, “Omg, you are baaaaad”).
Studies of phone addiction have found that the little hits of dopamine that keep users jonesing for notifications come with a tragic side effect—they actually inhibit the amount of dopamine we feel when exposed to real-life novelty.
people overvalue their actual skills, express excessive certainty in their evaluations, and overcredit themselves with positive outcomes. Austerely, this trifecta is labeled overconfidence bias.
As unbecoming as bigheadedness may be, it historically had its merits. A 2011 Nature study proposed that natural selection may have favored a swollen ego, as it enhanced resolve and perseverance, made it easier to bluff opponents in conflict, and generated a self-fulfilling prophecy where self-assuredness alone fostered better chances of survival.
cognitive biases, perhaps none is “more prevalent and more potentially catastrophic” than overconfidence.
How much will help you succeed professionally and feel internally content but not cross over into such delirium that you risk causing damage and annoying the living daylights out of everyone? Chapman psychologist Rachel Torres studies imposter syndrome, which is sometimes framed as the opposite of the Dunning-Kruger effect.
defines as the persistent self-perception that one is phony or incompetent, even as they’re swimming in evidence to the contrary. “I experience imposter syndrome a lot in my work, despite having years of training and all these credentials. I show up to meetings feeling like a fraud,” confessed Torres, who is not only
Women can actually be more morally punishing than men. A 2018 study published in the Personality and Social Psychology Bulletin found that for much of history, women were viewed as less moral, due to their higher “perceived emotionality.” Having
that women are in fact not more emotional than men overall, they do tend to display higher “self-conscious moral emotions and empathic concern.”
These qualities caused study participants to report lower intentions to commit “morally questionable actions” that could result in personal or professional benefits but cause harm along the way, like bending rules and lying during negotiations. Most of the women surveyed also considered these acts “less permissible” and worthy of “harsher moral condemnation” than male participants.
It seems to align with the self-serving dimension of overconfidence bias: the tendency to attribute positive outcomes to ourselves while blaming others for negative results.
“It’s the people who almost decide to live in glass houses who throw the first stones.” You know what’s ironic? The expertise of other people is actually what allows us to think we know more than we do.
Audre Lorde said, “Sometimes we drug ourselves with dreams of new ideas. The head will save us. The brain alone will set us free.” There are no new ideas, wrote Lorde, just new ways of making them felt: “We must constantly encourage ourselves and each other to attempt the heretical actions our dreams imply.” Sometimes, overconfident delusions give us the
humility is defined by “a low focus on the self, an accurate (not over- or underestimated) sense of one’s accomplishments and worth, and an acknowledgment of one’s limitations, imperfections, mistakes, gaps in knowledge, and so on.”
imposter syndrome and fewer actual imposters if we just lowered our standards a bit? Modern productivity dogma encourages us to act fast and milk our exceptionalism for all it’s worth. Under that kind of pressure, perhaps the truest rebellion is to embrace our ordinariness. In everyday life, if we could not only tolerate the discomfort but wholeheartedly embrace our own lack of expertise, then we might have a far better chance of showing others the same grace. Then perhaps life might
With cosmic irony, research on superiority complexes has found that people with depression assess their talents more objectively than others, a symptom termed “depressive realism.” A 2013 paper published in the Proceedings of the National Academy of Sciences noted that people with weak connectivity between their brain’s frontal lobe (responsible for our sense of self) and striatum (part of the reward system) overall thought more highly of themselves than those with stronger connections between the two areas. Dopamine neurotransmitters located in the striatum inhibit connectivity to the frontal…
illusory truth effect—our penchant to trust a statement as factual simply because we’ve heard it multiple times. Characterized by the power of repetition to make something false “sound true,” the illusory truth effect has been demonstrated using a bevy of stimuli, from fake news headlines and marketing claims to rumors, trivia, and internet memes.
The illusory truth effect’s influence can be as benign as believing medieval brides stank worse than they did or as corrosive as the myth that people who use the welfare system are lazy.
When you come across a sentiment twice and then three times, you start responding to it more quickly, your brain misinterpreting fluency as accuracy. Familiarity breeds comfort, but it also builds an immunity to unlearning and relearning, even in instances where you weren’t that attached to the knowledge in the first place—like the fake origins of wedding flowers.
The trouble is that we don’t sort the things we learn at confidence intervals. Instead, we treat everything filed away in our minds as equally true. As McTier put it, “My brain doesn’t
day-to-day handling of information, processing fluency (whether or not something “rings a bell”) is our default strategy for evaluating truth.
At the risk of sounding dramatic, repetition very well might be the closest thing we have to a magical spell.
Before the advent of writing systems and the mass distribution of books, the only way to learn anything without experiencing it firsthand was through oral repetition. Memorable chants, songs, poems, legends, allegories, jokes. The average human might not have been beset by information overload, but being underinformed wasn’t ideal, either. For most of history, a community’s elites (priests, royals, eventually the privileged scribes who were taught to read and write) guarded knowledge like precious ore. The clandestine nature of information gave these gatekeepers immense power, more than any…
Technology later democratized information, and the folkloric traditions people had been using to spread wisdom for millennia took on new forms.
legend is defined by three core qualities: It’s told as true but clearly carries undertones of doubt; its content is extremely difficult or impossible to confirm; and, not unlike a superstition, it helps us capture and cope with culture-wide fears. Legends don’t typically survive passage when they’re immediately disprovable—say, the claim that a certain harmless berry is actually deadly poisonous. Eat the berry, and you’ll know right
the words into sound units called phonemes. This process is called “acoustic encoding,” and it’s the first step in deciphering any word. With rhyme, the comely sound structure creates a kind of blueprint, a pattern, that seems to make the message itself more sensible. Rhyming “purifies the basics” of our highly complicated world, Harvard psycholinguist Dr. Steven Pinker once said. It brings order to the informational chaos out there. Needless to say,
The key to memory is finding a nice little place for every thought, and if a statement isn’t meaningful enough to feel orderly and thus memorable all on its own, forcing a pattern onto it, like rhyme, will do the
Truth, however, is not always the most important goal in storytelling. Instead, it might be to reinforce cultural ideals, demonstrate you’re part of an in-group, test social norms, or provoke laughter or disgust. Research finds that laughter and disgust are among the emotional responses most likely to make a piece of information both persuasive and shareable. Laughter and disgust—the
The motto of the Enlightenment was Sapere aude, meaning “dare to know.” But Dennis-Tiwary wrote that this new, science-empowered mind that dared to know was also “a vulnerable mind, robbed of the medieval certainty of faith.” Budding ideas of self-creation clashed with life’s erratic zigzags, and this discord cracked open a geyser of untapped anxiety.
Mistaking an anecdote for an objective fact is dubious, but using a story to breathe life into an objective fact is nothing short of magic.
Rhyme feels extra impactful in English, as opposed to Romance and Slavic languages, like Italian or Russian, in which rhyme appears naturally all the time. English is full of inconsistent word pronunciations, spellings, and verb conjugations, which make rhyming trickier. The language’s haphazard political and geographical histories resulted in a hodgepodge of Germanic, Romance, and Celtic. English has thus been called a linguistic “pickpocket,” rummaging through the purses of nearby tongues for useful vocabulary. This disorderliness does not lend itself particularly well to organic rhyme. So…
Broadly, the bias is characterized by a universal tendency to favor information that validates our existing views and discard that which refutes them. It’s an ancient heuristic that oozes into nearly every decision a person might make, from macro-level political ideologies to minor daily character assessments (say, swiping left on a potential date because they’re a Scorpio, and we all
cite a study from the early 2000s where participants were hooked up to MRIs and presented with data that either emboldened or negated their pre-held views on George W. Bush and John Kerry. Faced with facts they didn’t like, the reasoning areas of participants’ brains went dark, as if the prefrontal cortex stuck its fingers in its ears, shouted lalala, and left the room. By contrast, when shown corroborative information, their minds’ emotional provinces lit up brighter than my smile at the dinosaur exhibit.