Kindle Notes & Highlights
Read between November 30 and December 3, 2017
The big tech companies—the Europeans have charmingly, and correctly, lumped them together as GAFA (Google, Apple, Facebook, Amazon)—are shredding the principles that protect individuality. Their devices and sites have collapsed privacy; they disrespect the value of authorship, with their hostility to intellectual property. In the realm of economics, they justify monopoly with their well-articulated belief that competition undermines our pursuit of the common good and ambitious goals. When it comes to the most central tenet of individualism—free will—the tech companies have a different way.
OVER THE GENERATIONS, we’ve been through revolutions like this before. Not so many years ago, we delighted in the wonders of television dinners and the other newfangled foods that suddenly filled our kitchens: plastic-encased slices of cheese, oozing pizzas that emerged from a crust of ice, bags of crunchy tater tots. In the history of man, these seemed like breakthrough innovations. Time-consuming tasks—shopping for ingredients; each of those tedious steps in a recipe, with their trail of collaterally crusted pots and pans—were suddenly and miraculously consigned to history. The revolution in …
… toll. A whole new system of industrial farming emerged, with penny-conscious conglomerates cramming chickens into feces-covered pens and stuffing them full of antibiotics. By the time we came to understand the consequences of our revised patterns of consumption, the damage had been done to our waistline, longevity, soul, and planet. Something like the midcentury food revolution is now reordering the production and consumption of knowledge. Our intellectual habits are being scrambled by the dominant firms. Just as Nabisco and Kraft wanted to change how we eat and what we eat, Amazon, Facebook,
…
The tech companies are destroying something precious, which is the possibility of contemplation. They have created a world in which we’re constantly watched and always distracted. Through their accumulation of data, they have constructed a portrait of our minds, which they use to invisibly guide mass behavior (and increasingly individual behavior) to further their financial interests.
They have eroded the integrity of institutions—media, publishing—that supply the intellectual material that provokes thought and guides democracy. Their most precious asset is our most precious asset, our attention, and they have abused it.
BEFORE THE RISE OF SILICON VALLEY, monopoly was a pejorative in the dictionary of American life.
STEWART BRAND WOULD DRIVE his truck down the San Francisco Midpeninsula, through the dissipating fog of the early sixties.
All the lights and images that Brand deployed were like LSD, an attempt to artificially induce a heightened sense of consciousness. America needed Indians, and it needed acid, too. A jolt to rouse the country from its gray flannel numbness. In time, Brand would ascribe the same mind-bending powers to computers. But before he celebrated those machines, he didn’t much like them. Everything the nascent counterculture would come to despise—the mindless submission of the herd, the tyranny of bureaucracy—could be reduced to a pungent symbol, the computer. When Brand later looked back on the sixties,
…
By the late 1950s, IBM controlled 70 percent of the domestic computer market, with no true competitor trailing it. This monopoly was a product of savvy engineering, but also of the full support of the Pentagon and other branches of the state. (These subsidies helped the United States overtake first-rate European engineers, who enjoyed no such state support.)
Thus began Brand’s campaign of heckling NASA into releasing a color photograph of the whole Earth. He soon hitchhiked east to sell buttons on college campuses, which trumpeted his demand. This crusade, quixotic as it now sounds, helped awaken the environmental movement.
He created the Whole Earth Catalog, which was really more like an entirely new literary genre—or what Steve Jobs called “one of the bibles of my generation.”
In a way, this was a theory of radical individualism and self-reliance—a forerunner of Silicon Valley libertarianism.
It is an important fact of technological history that the outskirts of San Francisco were the national epicenter of both psychedelics and computing. Because of that geographical confluence, young engineers were unusually open to Stewart Brand’s message. That was especially true at Xerox’s famed freewheeling cauldron of creativity, its Palo Alto Research Center (PARC). One of the chief engineers there, Alan Kay, ordered every book listed in the Whole Earth Catalog and assembled them in an office library.
Two years later, when he grew this article into a book, he injected an important new phrase into the lexicon: “the personal computer.”
What he still craved was the sensation of wholeness—the profound belonging and authenticity he associated with the reservations and communes. They didn’t harbor a shred of alienation. They were at one with humanity. It was the same craving he felt when contemplating that missing photograph of Earth. This thinking was the exact opposite of Ayn Rand’s vision of libertarianism: a hunger for cooperation, sharing, and a self-conscious awareness of our place in a larger system. Brand could express this sentiment only in gusts of rhetoric that would never survive rigorous analysis, except for the …
The healing powers of the network could be found in McLuhan’s famous maxim: The medium is the message. Technology was the thing that mattered. He heaped blame upon Gutenberg’s invention, print, a medium he believed divided the world, isolating us from our fellow humans in the antisocial act of reading.
His critique was actually a lament—he longed for the world before print, for oral culture …
And the dream has fueled a succession of grand collaborative projects, cathedrals of knowledge built without any intention of profiting from the creation, from the virtual communities of the nineties to Linux to Wikipedia to the Creative Commons. It’s found in the very idea of open-source software. Such notions of sharing were once idealistic gestures and the reveries of shaggy inventors, but they have become so much the norm that they have been embraced by capitalism. The business plans of the most spectacularly successful firms in history, Google and Facebook, are all about wiring the world
…
There’s always been a strange, unacknowledged convergence in the thinking of technological dreamers and the rapacious industrial monopolists of the Gilded Age. Both like to imagine escaping the rigors of competitive capitalism; both wax lyrical about the virtues of “cooperation,” which they invoke as a matter of economic necessity. There are certain systems—the telephone and telegraph are classic examples—that simply never would have flourished in a competitive market. The costs of setting up a massive network are immense. Imagine the expense of laying down all those lines crisscrossing the …
The most important prophet of the new monopoly is an investor named Peter Thiel. He isn’t just any investor. His successes include PayPal, Facebook, Palantir, and SpaceX …
By idolizing competition, we fail to appreciate the values of monopolies. Without having to worry about rivals, monopolies can focus on important things—they can treat their workers well, they can focus on solving important problems and generating world-changing innovation. They can “transcend the daily brute struggle for survival.”
The laws of man are a mere nuisance that can only slow down such work. Institutions and traditions are rusty scrap for the heap.
Unconventionality wasn’t just a personal style; it was a career necessity.
… building machines that could simulate human thought. This subgenre of science fiction turned academic discipline goes by the name artificial intelligence (AI).
It is a testament to Carl Page’s teaching that his son went on to found the most successful, most ambitious AI company in history. Although we don’t think of Google that way, AI is precisely the source of the company’s greatness. Google uses algorithms trained to think just like you. To accomplish this daunting task, Google must understand the intentions behind your query: When you typed “rock,” did you mean the geological feature or the musical genre or the wrestler-turned-actor? Google’s AI is so proficient that it can even supply the results for your query before you’ve finished typing it.
…
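The autocomplete described above is, at bottom, a ranked prefix lookup over past queries. A minimal sketch in Python, with an invented toy query log (real systems weigh far richer signals than the raw frequencies assumed here: location, history, freshness):

```python
# Toy prefix index: suggest past queries that share the typed prefix,
# ranked by how often they were searched. The query log is invented.
from collections import defaultdict

class PrefixIndex:
    def __init__(self):
        self.by_prefix = defaultdict(list)  # prefix -> [(count, query), ...]

    def add(self, query, count):
        # Register the query under every prefix of itself.
        for i in range(1, len(query) + 1):
            self.by_prefix[query[:i]].append((count, query))

    def suggest(self, typed, k=3):
        # Most frequent past queries beginning with what was typed so far.
        candidates = self.by_prefix.get(typed, [])
        return [q for _, q in sorted(candidates, reverse=True)[:k]]

index = PrefixIndex()
for query, count in [("rock music", 90), ("rock formation", 40),
                     ("rock the wrestler", 25), ("rocket launch", 60)]:
    index.add(query, count)

print(index.suggest("rock"))  # ['rock music', 'rocket launch', 'rock formation']
```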
PAGE AND BRIN ARE CREATING a brain unhindered by human bias, uninfluenced by irrational desires and dubious sensory instructions that emanate from the body.
Descartes was indeed obsessed with automata, even if he didn’t always keep one by his bed. During his life, the machine age was arriving in Europe, a subset of the great scientific revolution.
He asserted that the human casing contains a divine instrument that elevates humankind above the animal kingdom. Inside our mortal hardware, the “prison of the body,” as Descartes called it, resides the software of mind. In his theory, the mind was the place to find both the intellect and the immortal soul, the capacity for reason and man’s most godlike qualities. This was a gorgeous squaring of the circle. Descartes had somehow managed to use skepticism in service of orthodoxy; he preserved crucial shards of church doctrine—the immortal soul, for starters—while buying intellectual space for
…
Descartes’s obsession became philosophy’s obsession. Over the centuries, mathematicians and logicians—Gottfried Leibniz, George Boole, Alfred North Whitehead—aspired to create a new system that would express thought in its purest (and therefore most divine) form. But for all the genius of these new systems, the prison of the body remained.
Philosophy couldn’t emancipate the mind, but technology just might. Google has set out to succeed where Descartes failed, except that it has jettisoned all the philosophical questions that rattled around in his head. Where Descartes emphasized skepticism and doubt, Google is never plagued by second-guessing. It has turned the liberation of the brain into an engineering challenge—an exercise that often fails to ask basic questions about the human implications of the project. This is a moral failing that afflicts Google and has haunted computer science from the start.
On a summer run in 1935, Turing lay down amid apple trees and conceived of something he called the Logical Computing Machine. His vision, recorded on paper, became the blueprint for the digital revolution.
… faith. He imagined a test of the computer’s intelligence in which a person would send written questions to a human and a machine in another room. Receiving two sets of answers, the interrogator would have to guess which answers came from the human. Turing predicted that within fifty years the machine would routinely fool the questioner.
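The imitation game Turing proposed is simple enough to stage in a few lines. A toy rendition in Python, with canned answers invented for illustration (the sonnet question is Turing's own, from his 1950 paper):

```python
# Toy imitation game: an interrogator questions two hidden respondents and
# must guess which room holds the machine. Answers are canned; once they are
# indistinguishable, the guess is no better than a coin flip.
import random

def human_answer(question):
    return "Count me out on this one. I never could write poetry."

def machine_answer(question):
    return "Count me out on this one. I never could write poetry."  # a perfect mimic, by assumption

respondents = {"human": human_answer, "machine": machine_answer}
rooms = dict(zip(["A", "B"], random.sample(list(respondents), 2)))  # hide who is where

question = "Please write me a sonnet on the subject of the Forth Bridge."
for room in ("A", "B"):
    print(f"Room {room}: {respondents[rooms[room]](question)}")

guess = random.choice(["A", "B"])  # identical answers force a blind guess
machine_room = next(r for r, who in rooms.items() if who == "machine")
print(f"Interrogator guesses room {guess}; the machine was in room {machine_room}.")
```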
Kurzweil was the perfect engineer, confident that he could work out any puzzle put in front of him. As a newly minted graduate of MIT, he proclaimed to a friend that he wanted “to invent things so that the blind could see, and the deaf could hear, and the lame could walk.” At the age of twenty-seven, he created a machine that could read to the blind. To describe the invention hardly captures its audacity. The blind could place their book on a scanner that would then pour the text into a computer, which would then articulate the words—before Kurzweil’s machine, the flatbed scanner did not exist. …
In Kurzweil’s telling, the singularity is when artificial intelligence becomes all-powerful, when computers are capable of designing and building other computers. This superintelligence will, of course, create a superintelligence even more powerful than itself—and so on, down the posthuman generations. At that point, all bets are off—“strong AI and nanotechnology can create any product, any situation, any environment that we can imagine at will.”
These developments will allow us to finally shed our “frail” and “limited” human bodies and brains, what he calls our “version 1.0 biological bodies.” We will fully merge with machines; our existence will become virtual; our brains will be uploaded. Thanks to his scientific reading, he can tell you the singularity will dawn in the year 2045.
The world will then change quickly: Computers will complete every basic human task, which will permit lives of leisure; pain will disappear, as will death; technology will solve the basic condition of scarcity that has always haunted life on the planet.
LARRY PAGE LIKES TO IMAGINE that he never escaped academia. Google, after all, began as a doctoral dissertation—and the inspiration for the search engine came from his connoisseurship of academic papers. As the son of a professor, he knew how researchers judge their own work. They look at the number of times it gets cited by other papers.
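Citation counting is the intuition behind PageRank, the ranking scheme that grew out of that dissertation: a page matters if pages that matter link to it. A minimal power-iteration sketch in Python, on an invented four-page link graph (the graph and the 0.85 damping factor are illustrative assumptions, not Google's production system):

```python
# Minimal PageRank: a page's score is the chance a random surfer lands on it,
# so links from highly ranked pages count for more, just as citations from
# important papers do. The four-page web below is invented.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
pages = list(links)
damping = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for p in pages:
        # Rank flowing into p from every page q that links to it.
        inflow = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inflow
    rank = new_rank

print({p: round(r, 3) for p, r in sorted(rank.items())})
# "C" draws the most inbound links, so it earns the highest score.
```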
If there’s tension between profit and the pursuit of scientific purity, Page will make a big show of choosing the path of purity. That is, of course, a source of Google’s success over the years. Where other search engines sold higher placement in their rankings, Google never took that blatantly transactional path. It could plausibly claim that its search results were scientifically derived.
Close observers of the company understood that Google abhorred MBA types. It stubbornly resisted the creation of a marketing department. Page prided himself on hiring engineers for business-minded jobs that would traditionally go to someone trained in, say, finance. Even as Google came to employ tens of thousands of workers, Larry Page personally reviewed a file on each potential hire to make sure that the company didn’t veer too far from its engineering roots.
The best expression of the company’s idealism was its oft-mocked motto, “Don’t be evil.”
Google has spearheaded the revival of a concept first explored in the sixties, one that had failed until recently: neural networks, which involve computing modeled on the workings of the human brain. Algorithms replicate the brain’s information processing and its methods for learning.
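"Modeled on the workings of the human brain" can be made concrete with a single artificial neuron. A minimal sketch in Python, assuming invented training data (the logical OR function) and a hand-picked learning rate, nothing like the scale of modern systems:

```python
# One artificial neuron: inputs are weighted, squashed through a sigmoid, and
# the weights are nudged to shrink the error, a crude analogue of connections
# strengthening with learning. Data and learning rate are invented.
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# Learn logical OR from four labeled examples: (inputs, target).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, bias, lr = 0.0, 0.0, 0.0, 1.0

for _ in range(5000):
    for (x1, x2), target in data:
        out = sigmoid(w1 * x1 + w2 * x2 + bias)
        grad = (out - target) * out * (1 - out)  # error through the sigmoid
        w1 -= lr * grad * x1                     # nudge each weight downhill
        w2 -= lr * grad * x2
        bias -= lr * grad

for (x1, x2), _ in data:
    print((x1, x2), "->", round(sigmoid(w1 * x1 + w2 * x2 + bias)))
```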
Because DeepMind feared the dangers of a single company possessing such powerful algorithms, it insisted that Google never permit its work to be militarized or sold to intelligence services.
When the company decided to digitize every book in existence, it considered copyright law a trivial annoyance, hardly worth a moment’s hesitation. Of course, Google must have had an inkling of how its project would be perceived. That’s why it went about its mission quietly, to avoid scrutiny.
Google’s trucks would pull up to libraries and quietly cart away boxes of books to be quickly scanned and returned. “If you don’t have a reason to talk about it, why talk about it?” Larry Page would argue, when confronted with pleas to publicly announce the existence of the program. The company’s lead lawyer on the project bluntly described the roughshod attitude of his colleagues: “Google’s leadership doesn’t care terribly much about precedent or law.” In this case, the precedent was the centuries-old protection of intellectual property, and the consequences were a potential devastation of the …
“We are not scanning all those books to be read by people. We are scanning them to be read by an AI.”
Google is a company without clear boundaries, or rather, a company with ever-expanding boundaries. That’s why it’s chilling to hear Larry Page denounce competition as a wasteful concept and to hear him celebrate cooperation as the way forward. “Being negative is not how we make progress and most important things are not zero sum,” he says. “How exciting is it to come to work if the best you can do is trounce some other company that does roughly the same thing?” And it’s even more chilling to hear him contemplate how Google will someday employ more than one million people, a company twenty
…
Facebook is always surveilling users, always auditing them, using them as lab rats in its behavioral experiments. While it creates the impression that it offers choice, Facebook paternalistically nudges users in the direction it deems best for them, which also happens to be the direction that thoroughly addicts them.
The theory holds that the sunshine of sharing our intimate details will disinfect the moral mess of our lives. Even if we don’t intend for our secrets to become public knowledge, their exposure will improve society. With the looming threat that our embarrassing information will be broadcast, we’ll behave better. And perhaps the ubiquity of incriminating photos and damning revelations will prod us to become more tolerant of one another’s sins.
“In a lot of ways Facebook is more like a government than a traditional company. We have this large community of people, and more than other technology companies we’re really setting policies.”
WITHOUT KNOWING IT, Zuckerberg is the heir to a long political tradition. Over the last two hundred years, the West has been unable to shake an abiding fantasy, a dream sequence in which we throw out the bum politicians and replace them with engineers—rule by slide rule. The French were the first to entertain this notion in the bloody, world-churning aftermath of their revolution. A coterie of the country’s most influential philosophers (notably, Henri de Saint-Simon and Auguste Comte) was genuinely torn about the course of the country. They hated all the ancient bastions of parasitic …