Kindle Notes & Highlights
Read between January 30 and April 25, 2021
The computer screen bulldozes our doubts with its bounties and conveniences. It is so much our servant that it would seem churlish to notice that it is also our master.
Descartes may have been wrong about dualism, but he appears to have been correct in believing that our thoughts can exert a physical influence on, or at least cause a physical reaction in, our brains. We become, neurologically, what we think.
While the proliferation of public clocks changed the way people worked, shopped, played, and otherwise behaved as members of an ever more regulated society, the spread of more personal tools for tracking time—chamber clocks, pocket watches, and, a little later, wristwatches—had more intimate consequences.
The intellectual ethic is the message that a medium or other tool transmits into the minds and culture of its users.
The natural state of the human brain, like that of the brains of most of our relatives in the animal kingdom, is one of distractedness.
Eric Schmidt, Google’s chief executive, sees the incorporation of social networking into theatrical and other events as an exciting new business opportunity for Internet firms. “The most obvious use of Twitter,” he says, can be seen in situations where “everybody is watching a play and are busy talking about the play while the play is under way.”
Crovitz has fallen victim to the blindness that McLuhan warned against: the inability to see how a change in a medium’s form is also a change in its content.
Our indulgence in the pleasures of informality and immediacy has led to a narrowing of expressiveness and a loss of eloquence.
In arguing that books are archaic and dispensable, Federman and Shirky provide the intellectual cover that allows thoughtful people to slip comfortably into the permanent state of distractedness that defines the online life.
The Net commands our attention with far greater insistency than our television or radio or morning newspaper ever did. Watch a kid texting his friends or a college student looking over the roll of new messages and requests on her Facebook page or a businessman scrolling through his e-mails on his phone—or consider yourself as you enter keywords into Google’s search box and begin following a trail of links. What you see is a mind consumed with a medium.
If they stop sending messages, they risk becoming invisible.
The redirection of our mental resources, from reading words to making judgments, may be imperceptible to us—our brains are quick—but it’s been shown to impede comprehension and retention, particularly when it’s repeated frequently. As the executive functions of the prefrontal cortex kick in, our brains become not only exercised but overtaxed. In a very real way, the Web returns us to the time of scriptura continua, when reading was a cognitively strenuous act.
The mind of the experienced book reader is a calm mind, not a buzzing one.
Try reading a book while doing a crossword puzzle; that’s the intellectual environment of the Internet.
The multimedia technologies so common to the Web, the researchers concluded, “would seem to limit, rather than enhance, information acquisition.”
When carried into the realm of the intellect, the industrial ideal of efficiency poses, as Hawthorne understood, a potentially mortal threat to the pastoral ideal of meditative thought. That doesn’t mean that promoting the rapid discovery and retrieval of information is bad. It’s not. The development of a well-rounded mind requires both an ability to find and quickly parse a wide range of information and a capacity for open-ended reflection. There needs to be time for efficient data collection and time for inefficient contemplation, time to operate the machine and time to sit idly in the…
Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by artificial intelligence is as unsettling as it is revealing. It underscores the firmness and the certainty with which Google holds to its Taylorist belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized.
Such observations implied that a memory, even a strong one, remains unstable for a brief period after it’s formed. A certain amount of time seemed to be required for a primary, or short-term, memory to be transformed into a secondary, or long-term, one.
The more times an experience is repeated, the longer the memory of the experience lasts. Repetition encourages consolidation.
Biological memory is in a perpetual state of renewal.
When a person fails to consolidate a fact, an idea, or an experience in long-term memory, he’s not “freeing up” space in his brain for other functions. In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections.
The Web is a technology of forgetfulness.
And, thanks once again to the plasticity of our neuronal pathways, the more we use the Web, the more we train our brain to be distracted—to process information very quickly and very efficiently but without sustained attention.
Our growing dependence on the Web’s information stores may in fact be the product of a self-perpetuating, self-amplifying loop. As our use of the Web makes it harder for us to lock information into our biological memory, we’re forced to rely more and more on the Net’s capacious and easily searchable artificial memory, even if it makes us shallower thinkers.
Of all the sacrifices we make when we devote ourselves to the Internet as our universal medium, the greatest is likely to be the wealth of connections within our own minds.
Personal memory shapes and sustains the “collective memory” that underpins culture. What’s stored in the individual mind—events, facts, concepts, skills—is more than the “representation of distinctive personhood” that constitutes the self, writes the anthropologist Pascal Boyer. It’s also “the crux of cultural transmission.” Each of us carries and projects the history of the future. Culture is sustained in our synapses.
During the 1950s and ’60s, the enthusiasm for computers, software programming, and artificial intelligence gave rise not only to the idea that the human brain is a type of computer but to the sense that human language is the output of one of the algorithms running inside that computer.
In deciding when to eat, to work, to sleep, to wake up, we stopped listening to our senses and started obeying the clock. We became a lot more scientific, but we became a bit more mechanical as well.
Whenever we use a tool to exert greater control over the outside world, we change our relationship with that world.
Nature isn’t our enemy, but neither is it our friend. McLuhan’s point was that an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained.
Before Frederick Taylor introduced his system of scientific management, the individual laborer, drawing on his training, knowledge, and experience, would make his own decisions about how he did his work.
The messiness that comes with individual autonomy was cleaned up, and the factory as a whole became more efficient, its output more predictable. Industry prospered. What was lost along with the messiness was personal initiative, creativity, and whim. Conscious craft turned into unconscious routine.