Kindle Notes & Highlights
The condition of “hyperopia”—farsightedness—was widely distributed through the population, but most people didn’t notice that they suffered from it, because they didn’t read. For a monk, straining to translate Lucretius by the flickering light of a candle, the need for spectacles was all too apparent. But the general population—the vast majority of them illiterate—had almost no occasion to discern tiny shapes like letterforms as part of their daily routine. People were farsighted; they just didn’t have any real reason to notice that they were farsighted. And so spectacles remained rare and …
What changed all of that, of course, was Gutenberg’s invention of the printing press in the 1440s.
But Gutenberg’s great breakthrough had another, less celebrated effect: it made a massive number of people aware for the first time that they were farsighted. And that revelation created a surge in demand for spectacles.
Gutenberg made printed books relatively cheap and portable, which triggered a rise in literacy, which exposed a flaw in the visual acuity of a sizable part of the population, which then created a new market for the manufacture of spectacles. Within a hundred years of Gutenberg’s invention, thousands of spectacle makers around Europe were thriving, and glasses became the first piece of advanced technology—since the invention of clothing in Neolithic times—that ordinary people would regularly wear on their bodies.
In 1590 in the small town of Middelburg in the Netherlands, father and son spectacle makers Hans and Zacharias Janssen experimented with lining up two lenses, not side by side like spectacles, but in line with each other, magnifying the objects they observed, thereby inventing the microscope.
In January of 1610, just two years after Lippershey had filed for his patent, Galileo used the telescope to observe that moons were orbiting Jupiter, the first real challenge to the Aristotelian paradigm that assumed all heavenly bodies circled the Earth.
New ways of measuring almost always imply new ways of making.
In 1970, researchers at Corning Glassworks—the Murano of modern times—developed a type of glass that was so extraordinarily clear that if you created a block of it the length of a bus, it would be just as transparent as looking through a normal windowpane. (Today, after further refinements, the block could be a half-mile long with the same clarity.) Scientists at Bell Labs then took fibers of this super-clear glass and shot laser beams down the length of them, fluctuating optical signals that corresponded to the zeroes and ones of binary code. This hybrid of two seemingly unrelated …
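A rough sketch of the idea in that highlight: binary data carried as on/off pulses of light. The one-bit-per-pulse scheme and the function names below are illustrative assumptions, not how real fiber-optic links frame or modulate their signals.

```python
# Illustrative sketch only: sending binary data as on/off light pulses.
# The scheme and names are hypothetical simplifications.

def to_pulses(data: bytes) -> list[int]:
    """Turn each byte into eight on/off pulses (1 = laser on, 0 = laser off)."""
    pulses = []
    for byte in data:
        for bit in range(7, -1, -1):          # most significant bit first
            pulses.append((byte >> bit) & 1)
    return pulses

def from_pulses(pulses: list[int]) -> bytes:
    """Reassemble detected pulses back into bytes."""
    out = bytearray()
    for i in range(0, len(pulses), 8):
        byte = 0
        for bit in pulses[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

message = "hello, glass".encode("utf-8")
received = from_pulses(to_pulses(message))
assert received == message                     # the light pulses carry the text intact
```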
The mirror helped invent the modern self, in some real but unquantifiable way.
Once again, glass has extended our vision: not just down to the invisible world of cells and microbes, or the global connectivity of the cameraphone, but all the way back to the early days of the universe.
This was the hummingbird effect that the furnace unleashed: by learning how to generate extreme heat in a controlled environment, we unlocked the molecular potential of silicon dioxide, which soon transformed the way we see the world, and ourselves.
The history of global trade had clearly demonstrated that vast fortunes could be made by transporting a commodity that was ubiquitous in one environment to a place where it was scarce.
He took three things that the market had effectively priced at zero—ice, sawdust, and an empty vessel—and turned them into a flourishing business.
Inventions and scientific discoveries tend to come in clusters, where a handful of geographically dispersed investigators stumble independently onto the very same discovery.
The electric battery, the telegraph, the steam engine, and the digital music library were all independently invented by multiple individuals in the space of a few years.
The first great breakthrough in our obsession with the human voice arrived in the simple act of writing it down.
In March 1857, two decades before Thomas Edison would invent the phonograph, the French patent office awarded Scott a patent for a machine that recorded sound.
In the annals of invention, there may be no more curious mix of farsightedness and myopia than the story of the phonautograph. On the one hand, Scott had managed to make a critical conceptual leap—that sound waves could be pulled out of the air and etched onto a recording medium—more than a decade before other inventors and scientists got around to it. (When you’re two decades ahead of Edison, you can be pretty sure you’re doing well for yourself.) But Scott’s invention was hamstrung by one crucial—even comical—limitation. He had invented the first sound recording device in history. But he …
The idea that machines could convey sound waves that had originated elsewhere was not at all an intuitive one; it wasn’t until Alexander Graham Bell began reproducing sound waves at the end of a telephone that playback became an obvious leap.
Scott got to the idea of recording audio through the metaphor of stenography: write waves instead of words. That structuring metaphor enabled him to make the first leap, years ahead of his peers, but it also may have prevented him from making the second. Once words have been converted into the code of shorthand, the information captured there is decoded by a reader who understands the code. Scott thought the same would happen with his phonautograph. The machine would etch waveforms into the lampblack, each twitch of the stylus corresponding to some phoneme uttered by a human voice. And humans …
When Thomas Edison completed Scott’s original project and invented the phonograph in 1877, he imagined it would regularly be used as a means of sending audio letters through the postal system. Individuals would record their missives on the phonograph’s wax cylinders, and then pop them into the mail, to be played back days later. Bell, in inventing the telephone, made what was effectively a mirror-image miscalculation: He envisioned one of the primary uses for the telephone to be as a medium for sharing live music. An orchestra or singer would sit on one end of the line, and listeners would sit …
If you were making a phone call in the United States at any point between 1930 and 1984, you were almost without exception using AT&T’s network.
AT&T managed to keep the regulators at bay by convincing them that the phone network was a “natural monopoly” and a necessary one. Analog phone circuits were simply too complicated to be run by a hodgepodge of competing firms; if Americans wanted to have a reliable phone network, it needed to be run by a single company.
AT&T would be allowed to maintain its monopoly over phone service, but any patented invention that had originated in Bell Labs would have to be freely licensed to any American company that found it useful, and all new patents would have to be licensed for a modest fee.
Thanks to the antitrust resolution, Bell Labs became one of the strangest hybrids in the history of capitalism: a vast profit machine generating new ideas that were, for all practical purposes, socialized.
During World War II, the legendary mathematician Alan Turing and Bell Labs’ A. B. Clark collaborated on a secure communications line, code-named SIGSALY.
SIGSALY was not just a milestone in telephony. It was a watershed moment in the history of media and communications more generally: for the first time, our experiences were being digitized. The technology behind SIGSALY would continue to be useful in supplying secure lines of communication. But the truly disruptive force that it unleashed would come from another strange and wonderful property it possessed: digital copies could be perfect copies.
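To make the “perfect copies” point concrete, here is a toy comparison; the signal, noise level, and copy counts are made-up assumptions. An analog-style copy drifts a little with every generation, while copying digital samples reproduces them exactly.

```python
# Toy illustration of why digitizing mattered: analog re-recording adds a
# little noise each generation, digital samples copy bit-for-bit.
import math
import random

def analog_copy(signal, noise=0.01):
    """Copying an analog waveform degrades it slightly every generation."""
    return [s + random.uniform(-noise, noise) for s in signal]

def digital_copy(samples):
    """Copying digital samples reproduces them exactly."""
    return list(samples)

original = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
quantized = [round(s * 32767) for s in original]    # crude 16-bit quantization

analog = original
digital = quantized
for _ in range(100):                                 # one hundred generations of copying
    analog = analog_copy(analog)
    digital = digital_copy(digital)

drift = max(abs(a - b) for a, b in zip(analog, original))
print(f"analog copy has drifted by up to {drift:.3f}")   # noise accumulates
print("digital copy identical:", digital == quantized)   # always True
```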
If the robot historians of the future had to mark one moment where the “digital age” began—the computational equivalent of the Fourth of July or Bastille Day—that transatlantic phone call in July 1943 would certainly rank high on the list.
Radio began its life as a two-way medium, a practice that continues to this day as ham radio.
Almost overnight, radio made jazz a national phenomenon.
There were “underground” artists before radio—impoverished poets and painters—but radio helped create a template that would become commonplace: the underground artist who becomes an overnight celebrity.
The jazz stars gave white America an example of African-Americans becoming famous and wealthy and admired for their skills as entertainers rather than advocates.
The radio signals were color-blind. Like the Internet, they didn’t break down barriers as much as live in a world separate from them.
We were no longer dependent on the reverberations of caves or cathedrals or opera houses to make our voices louder. Now electricity could do the work of echoes, but a thousand times more powerfully.
Remove the microphone and amplifier from the toolbox of twentieth-century technology and you remove one of that century’s defining forms of political organization, from Nuremberg to “I Have a Dream.”
Starting in the 1950s, guitarists playing through tube amplifiers noticed that they could make an intriguing new kind of sound by overdriving the amp: a crunchy layer of noise on top of the notes generated by strumming the strings of the guitar itself. This was, technically speaking, the sound of the amplifier malfunctioning, distorting the sound it had been designed to reproduce. To most ears it sounded like something was broken with the equipment, but a small group of musicians began to hear something appealing in the sound.
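For a sense of what “overdriving” does to a waveform, here is a minimal sketch; the gain and ceiling values are arbitrary assumptions, chosen only to show the clipped peaks that listeners hear as that crunchy distortion.

```python
# Minimal sketch of overdrive: past a certain input level the amplifier
# cannot swing any further, so the peaks of the waveform flatten (clip).
import math

def overdrive(sample: float, gain: float = 8.0, ceiling: float = 1.0) -> float:
    """Boost the signal, then hard-clip anything beyond what the amp can output."""
    boosted = sample * gain
    return max(-ceiling, min(ceiling, boosted))

clean = [math.sin(2 * math.pi * i / 64) for i in range(64)]   # one cycle of a pure tone
driven = [overdrive(s) for s in clean]

clipped = sum(1 for s in driven if abs(s) == 1.0)
print(f"{clipped} of {len(driven)} samples are pinned at the ceiling")  # the flattened peaks
```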
Sound engineers would go to great lengths to eliminate feedback from recordings or concert settings, positioning microphones so they didn’t pick up signal from the speakers, and thus cause the infinite-loop screech of feedback. Yet once again, one man’s malfunction turned out to be another man’s music, as artists such as Jimi Hendrix or Led Zeppelin—and later punk experimentalists like Sonic Youth—embraced the sound in their recordings and performances. In a real sense, Hendrix was not just playing the guitar on those feedback-soaked recordings in the late 1960s, he was creating a new sound …
Sometimes cultural innovations come from using new technologies in unexpected ways.
Sometimes the innovation comes from a less likely approach: by deliberately exploiting the malfunctions, turning noise and error into a useful signal. Every genuinely new technology has a genuinely new way of breaking—and every now and then, those malfunctions open a new door in the adjacent possible.
But lighthouses perform poorly at precisely the point where they are needed the most: in stormy weather.