Kindle Notes & Highlights
Read between December 28, 2020 and January 30, 2021
Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.
David Brooks, the popular New York Times columnist, makes a similar point. “I had thought that the magic of the information age was that it allowed us to know more,” he writes, “but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants—silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.”
Culture is more than the aggregate of what Google describes as “the world’s information.” It’s more than what can be reduced to binary code and uploaded onto the Net. To remain vital, culture must be renewed in the minds of the members of every generation. Outsource memory, and culture withers.
“What I had not realized,” said Weizenbaum, “is that extremely short exposures to a relatively simple computer program could induce
What makes us most human, Weizenbaum had come to believe, is what is least computable about us—the connections between our mind and our body, the experiences that shape our memory and our thinking, our capacity for emotion and empathy. The great danger we face as we become more intimately involved with our computers—as we come to experience more of our lives through the disembodied symbols flickering across our screens—is that we’ll begin to lose our humanness, to sacrifice the very qualities that separate us from machines. The only way to avoid that fate, Weizenbaum wrote, is to have the
WHEN A CARPENTER picks up a hammer, the hammer becomes, so far as his brain is concerned, part of his hand. When a soldier raises a pair of binoculars to his face, his brain sees through a new set of eyes, adapting instantaneously to a very different field of view. The experiments on pliers-wielding monkeys revealed how readily the plastic primate brain can incorporate tools into its sensory maps, making the artificial feel natural. In the human brain, that capacity has advanced far beyond what’s seen in even our closest primate cousins. Our ability to meld with all manner of tools is one of
“The typewriter makes for lucidity, but I am not sure that it encourages subtlety.”
“We shape our tools,” observed the Jesuit priest and media scholar John Culkin in 1967, “and thereafter they shape us.”
In one of the most perceptive, if least remarked, passages in Understanding Media, McLuhan wrote that our tools end up “numbing” whatever part of our body they “amplify.”20 When we extend some part of ourselves artificially, we also distance ourselves from the amplified part and its natural functions. When the power loom was invented, weavers could manufacture far more cloth during the course of a workday than they’d been able to make by hand, but they sacrificed some of their manual dexterity, not to mention some of their “feel” for fabric. Their fingers, in McLuhan’s terms, became numb.
The tools of the mind amplify and in turn numb the most intimate, the most human, of our natural capacities—those for reason, perception, memory, emotion. The mechanical clock, for all the blessings it bestowed, removed us from the natural flow of time.
In deciding when to eat, to work, to sleep, to wake up, we stopped listening to our senses and started obeying the clock. We became a lot more scientific, but we became a bit more mechanical as well.
When people came to rely on maps rather than their own bearings, they would have experienced a diminishment of the area of their hippocampus devoted to spatial representation. The numbing would have occurred deep in their neurons.
Nature isn’t our enemy, but neither is it our friend. McLuhan’s point was that an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained. We shouldn’t allow the glories of technology to blind our inner watchdog to the possibility that we’ve numbed an essential part of our self.
The “chronic overactivity of those brain regions implicated in social thought” can, writes Mitchell, lead us to perceive minds where no minds exist, even in “inanimate objects.” There’s growing evidence, moreover, that our brains naturally mimic the states of the other minds we interact with, whether those minds are real or imagined. Such neural “mirroring” helps explain why we’re so quick to attribute human characteristics to our computers and computer characteristics to ourselves—why we hear a human voice when ELIZA speaks.
Our willingness, even eagerness, to enter into what Doidge calls “a single, larger system” with our data-processing devices is an outgrowth not only of the characteristics of the digital computer as an informational medium but of the characteristics of our socially adapted brains. While this cybernetic blurring of mind and machine may allow us to carry out certain cognitive tasks far more efficiently, it poses a threat to our integrity as human beings. Even as the larger system into which our minds so readily meld is lending us its powers, it is also imposing on us its limitations. To put a
More information can mean less knowledge.
The more that people depended on explicit guidance from software programs, the less engaged they were in the task and the less they ended up learning. The findings indicate, van Nimwegen concluded, that as we “externalize” problem solving and other cognitive chores to our computers, we reduce our brain’s ability “to build stable knowledge structures”—schemas, in other words—that can later “be applied in new situations.”29 A polemicist might put it more pointedly: The brighter the software, the dimmer the user.
When a ditchdigger trades his shovel for a backhoe, his arm muscles weaken even as his efficiency increases. A similar trade-off may well take place as we automate the work of the mind.
The easy way may not always be the best way, but the easy way is the way our computers and search engines encourage us to take.
A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper. The reason, according to attention restoration theory, or ART, is that when people aren’t being bombarded by external stimuli, their brains can, in effect, relax. They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their
Spending time in the park, the researchers found, “significantly improved” people’s performance on the cognitive tests, indicating a substantial increase in attentiveness. Walking in the city, by contrast, led to no improvement in test results.
The people who looked at pictures of nature scenes were able to exert substantially stronger control over their attention, while those who looked at city scenes showed no improvement in their attentiveness. “In sum,” concluded the researchers, “simple and brief interactions with nature can produce marked increases in cognitive control.” Spending time in the natural world seems to be of “vital importance” to “effective cognitive functioning.”34
The story revealed just how prescient Weizenbaum had been when, decades ago, he warned that as we grow more accustomed to and dependent on our computers we will be tempted to entrust to them “tasks that demand wisdom.” And once we do that, there will be no turning back. The software will become indispensable to those tasks.
In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
“people have little awareness of the frequency with which they check their phone.”
In a 2015 Florida State University study, psychologists found that when people’s phones beep or buzz while they’re in the middle of a challenging task, their focus wavers and their work gets sloppier—whether they check the phone or not.
Another 2015 study, published in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills weaken.7 A 2016 experiment by a University of Virginia psychologist and two colleagues revealed that phone notifications produce symptoms of hyperactivity and absentmindedness similar to those that afflict people with attention deficit disorders.
As the phone’s proximity increased, brainpower decreased. It was as if the smartphones had force fields that sapped their owners’ intelligence. In subsequent interviews, nearly all the students said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones muddled their thinking. A follow-up experiment, with nearly 300 participants, produced similar results. It also revealed that the more heavily the students relied on their phones in their everyday lives, the greater the cognitive penalty they
Smartphones have become so tied up in our lives that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check a phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking, the authors noted. The fact that most of us now habitually keep our phones “nearby and in sight” only magnifies the toll.
It’s when we need to be smart that our phones dumb us down.
A 2016 survey of nearly a hundred high schools in Britain found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.12
In selecting targets of attention, the network gives priority to four types of stimuli: those that are novel or unexpected, those that are pleasurable or otherwise rewarding, those that are personally relevant, and those that are emotionally engaging.16 These are exactly the kinds of stimuli our smartphones supply—all the time and in abundance. Refreshing their contents continuously, our phones are fonts of new and surprising information. Our phones give us stimulation and gratification whenever we check them, triggering releases of the pleasure-producing neurotransmitter dopamine.17 Because
The Internet industry may have begun in idealism, but it’s now powered by a manipulative and very lucrative feedback loop. The more we use our phones, the more data social-media companies amass on the way our minds respond to stimuli. They use that information to make their apps even more addictive. And the money rolls in.
Steve Jobs told us we’d have our lives in our pockets. He didn’t warn us about the pickpockets.
In a 2011 study, now considered a landmark in the field, a team of researchers led by Columbia psychology professor Betsy Sparrow and including the late Harvard memory expert Daniel Wegner had people read forty brief, factual statements—“the space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003” was a typical one—then type the statements into a computer. Half the participants were told that the machine would save what they typed, and the rest were told that the statements would be erased immediately. Afterward, the researchers asked the subjects to write down as many of
“Creating a hard copy of an experience through media leaves only a diminished copy in our own heads.”27 With social media allowing and encouraging us to upload accounts of pretty much everything we do, this effect is now widespread. A 2017 Frontiers in Psychology survey of peer-reviewed research on how smartphones affect memory concluded that “when we turn to these devices, we generally learn and remember less from our experiences.”28
“The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” Wegner and Ward concluded, even though “they may know ever less about the world around them.” That unhappy insight probably helps explain society’s current gullibility crisis, with its attendant plague of propaganda, dogma, and venom. If your phone has blunted your powers of discernment, you’ll believe anything it tells you. And you won’t hesitate to share deceptive information with others.
“False news spreads farther, faster, deeper, and more broadly than the truth because humans, not robots, are more likely to spread it.”31 The technology we assumed would enlarge us has made us smaller.
When we constrict our capacity for reasoning and recall, or transfer those skills to a machine or a corporation, we sacrifice the ability to turn information into knowledge. We get the data but lose the meaning. Barring a cultural course correction, that may be the Internet’s most enduring legacy.