Kindle Notes & Highlights
by James Gleick
Read between October 25, 2020 - January 17, 2021
The Bell System had none of those, but the company had hired its first mathematician in 1897: George Campbell, a Minnesotan who had studied in Göttingen and Vienna.
Vannevar Bush’s Differential Analyzer, which could solve equations with great rotating gears, shafts, and wheels.
Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being.
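A quick check of that six-billion-bit figure (my own arithmetic, not the book's): with a four-letter alphabet each base carries log2(4) = 2 bits, and one copy of the human genome runs to roughly three billion base pairs.

    import math

    # Four possible bases (A, C, G, T): log2(4) = 2 bits per position
    bits_per_base = math.log2(4)

    # Roughly 3 billion base pairs in one copy of the human genome
    base_pairs = 3_000_000_000

    total_bits = bits_per_base * base_pairs   # 6 billion bits
    total_bytes = total_bits / 8              # ~750 megabytes

    print(f"{total_bits:.2e} bits (~{total_bytes / 1e6:.0f} MB)")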
Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.”♦ This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed.
“What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.
More capacity was required for less extraordinary occasions. People tried flags, horns, intermittent smoke, and flashing mirrors. They conjured spirits and angels for purposes of communication—angels being divine messengers, by definition.
Redundancy—inefficient by definition—serves as the antidote to confusion. It provides second chances. Every natural language has redundancy built in; this is why people can understand text riddled with errors and why they can understand conversation in a noisy room. The natural redundancy of English motivates the famous New York City subway poster of the 1970s (and the poem by James Merrill): “if u cn rd ths u cn gt a gd jb w hi pa!”
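Shannon made this redundancy measurable, and a minimal sketch (mine, with assumed letter frequencies, not the book's) shows the idea: compare the entropy of English single-letter frequencies with the log2(26) bits a letter could carry at most. Even this first-order estimate, which ignores the longer-range structure that pushes English's true redundancy far higher, reveals the slack that garbled text leans on.

    import math

    # Approximate single-letter frequencies for English text (assumed values)
    freq = {
        'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070,
        'n': 0.067, 's': 0.063, 'h': 0.061, 'r': 0.060, 'd': 0.043,
        'l': 0.040, 'c': 0.028, 'u': 0.028, 'm': 0.024, 'w': 0.024,
        'f': 0.022, 'g': 0.020, 'y': 0.020, 'p': 0.019, 'b': 0.015,
        'v': 0.010, 'k': 0.008, 'j': 0.002, 'x': 0.002, 'q': 0.001,
        'z': 0.001,
    }

    # First-order entropy: H = -sum(p * log2 p), in bits per letter
    h = -sum(p * math.log2(p) for p in freq.values())
    h_max = math.log2(26)   # a letter could carry up to ~4.70 bits

    print(f"entropy    ~ {h:.2f} bits/letter")
    print(f"maximum    ~ {h_max:.2f} bits/letter")
    print(f"redundancy ~ {1 - h / h_max:.0%}")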
“Language in fact bears the same relationship to the concept of mind that legislation bears to the concept of parliament,” says Jonathan Miller: “it is a competence forever bodying itself in a series of concrete performances.”♦
There is a progression from pictographic, writing the picture; to ideographic, writing the idea; and then logographic, writing the word.
Conclusions follow from premises. These require a degree of constancy. They have no power unless people can examine and evaluate them. In contrast, an oral narrative proceeds by accretion, the words passing by in a line of parade past the viewing stand, briefly present and then gone, interacting with one another via memory and association. There are no syllogisms in Homer.
In the ancient world, alphabetical lists scarcely appeared until around 250 BCE, in papyrus texts from Alexandria. The great library there seems to have used at least some alphabetization in organizing its books.
Not until 1613 was the first alphabetical catalogue made—not printed, but written in two small handbooks—for the Bodleian Library at Oxford.♦
Information is entropy. This was the strangest and most powerful notion of all. Entropy—already a difficult and poorly understood concept—is a measure of disorder in thermodynamics, the science of heat and energy.
Counting all the possible ways a system can be arranged, the disorderly ones far outnumber the orderly ones. There are many arrangements, or “states,” in which molecules are all jumbled, and few in which they are neatly sorted. The orderly states have low probability and low entropy.
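That counting argument can be made concrete. A minimal sketch (my example, not the book's): put 100 molecules in a box, each independently in the left or right half. The neatly sorted arrangements number exactly two, the even 50/50 splits alone number about 10^29, and entropy tracks the logarithm of those counts.

    from math import comb, log2

    n = 100                  # molecules, each in the left or right half of a box

    total = 2 ** n           # every possible arrangement ("state")
    sorted_states = 2        # all-left or all-right: the neatly ordered cases
    jumbled_states = comb(n, n // 2)   # arrangements with an even 50/50 split

    print(f"all arrangements:   2^{n} = {total:.3e}")
    print(f"perfectly sorted:   {sorted_states}")
    print(f"50/50 jumbled:      {jumbled_states:.3e}")

    # Boltzmann's insight: entropy grows with the log of the arrangement count
    print(f"sorted entropy:     {log2(sorted_states):.1f} bits")
    print(f"jumbled entropy:    {log2(jumbled_states):.1f} bits")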
Chomsky’s offbeat and original paper “Three Models for the Description of Language,”♦
When information is cheap, attention becomes expensive.