The Information: A History, a Theory, a Flood
Read between January 11 - February 3, 2022
1%
For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague—force, mass, motion, and even time—and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too ...more
1%
No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly ...more
1%
Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.” This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete ...more
1%
When photons and electrons and other particles interact, what are they really doing? Exchanging bits, transmitting quantum states, processing information. The laws of physics are the algorithms. Every burning star, every silent nebula, every particle leaving its ghostly trace in a cloud chamber is an information processor. The universe computes its own destiny.
2%
Everything changed so quickly. John Robinson Pierce (the Bell Labs engineer who had come up with the word transistor) mused afterward: “It is hard to picture the world before Shannon as it seemed to those who lived in it. It is difficult to recover innocence, ignorance, and lack of understanding.”
3%
In the name of speed, Morse and Vail had realized that they could save strokes by reserving the shorter sequences of dots and dashes for the most common letters. But which letters would be used most often? Little was known about the alphabet’s statistics. In search of data on the letters’ relative frequencies, Vail was inspired to visit the local newspaper office in Morristown, New Jersey, and look over the type cases. He found a stock of twelve thousand E’s, nine thousand T’s, and only two hundred Z’s. He and Morse rearranged the alphabet accordingly.
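The saving Vail's type-case survey bought can be sketched in a few lines. The letter counts are the ones from the Morristown anecdote; the dot-dash codes shown are modern International Morse, used purely for illustration (Morse and Vail's original American code differed in detail).

```python
# Shorter codes for commoner letters: the economy behind Vail's type-case visit.
# Counts come from the Morristown anecdote; codes are modern International Morse.
MORSE = {"E": ".", "T": "-", "Z": "--.."}
COUNTS = {"E": 12000, "T": 9000, "Z": 200}

total_letters = sum(COUNTS.values())
# Average dots-and-dashes per letter under the frequency-aware code...
weighted = sum(COUNTS[c] * len(MORSE[c]) for c in COUNTS) / total_letters
# ...versus a fixed-length dot/dash code, which needs 5 symbols per letter
# to distinguish 26 letters (2**5 = 32 >= 26).
fixed = 5.0
print(f"{weighted:.2f} symbols/letter vs {fixed:.0f} fixed-length")
```

On this (admittedly E-and-T-heavy) sample the frequency-aware code averages barely one symbol per letter, against five for a fixed-length scheme.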
4%
“TRY TO IMAGINE,” proposed Walter J. Ong, Jesuit priest, philosopher, and cultural historian, “a culture where no one has ever ‘looked up’ anything.” To subtract the technologies of information internalized over two millennia requires a leap of imagination backward into a forgotten past. The hardest technology to erase from our minds is the first of all: writing. This arises at the very dawn of history, as it must, because the history begins with the writing. The pastness of the past depends on it.
5%
Greece had not needed the alphabet to create literature—a fact that scholars realized only grudgingly, beginning in the 1930s. That was when Milman Parry, a structural linguist who studied the living tradition of oral epic poetry in Bosnia and Herzegovina, proposed that the Iliad and the Odyssey not only could have been but must have been composed and sung without benefit of writing. The meter, the formulaic redundancy, in effect the very poetry of the great works served first and foremost to aid memory. Its incantatory power made of the verse a time capsule, able to transmit a virtual ...more
5%
Aristotle himself, son of the physician to the king of Macedonia and an avid, organized thinker, was attempting to systematize knowledge. The persistence of writing made it possible to impose structure on what was known about the world and, then, on what was known about knowing. As soon as one could set words down, examine them, look at them anew the next day, and consider their meaning, one became a philosopher, and the philosopher began with a clean slate and a vast project of definition to undertake. Knowledge could begin to pull itself up by the bootstraps.
5%
Havelock focused on the process of converting, mentally, from a “prose of narrative” to a “prose of ideas”; organizing experience in terms of categories rather than events; embracing the discipline of abstraction. He had a word in mind for this process, and the word was thinking. This was the discovery, not just of the self, but of the thinking self—in effect, the true beginning of consciousness.
5%
Logic might be imagined to exist independent of writing—syllogisms can be spoken as well as written—but it did not. Speech is too fleeting to allow for analysis. Logic descended from the written word, in Greece as well as India and China, where it developed independently. Logic turns the act of abstraction into a tool for determining what is true and what is false: truth can be discovered in words alone, apart from concrete experience. Logic takes its form in chains: sequences whose members connect one to another. Conclusions follow from premises. These require a degree of constancy. They have ...more
This highlight has been truncated due to consecutive passage length restrictions.
7%
Then the vanished world of primary orality was not much missed. Not until the twentieth century, amid a burgeoning of new media for communication, did the qualms and the nostalgia resurface. Marshall McLuhan, who became the most famous spokesman for the bygone oral culture, did so in the service of an argument for modernity. He hailed the new “electric age” not for its newness but for its return to the roots of human creativity. He saw it as a revival of the old orality. “We are in our century ‘winding the tape backward,’ ” he declared, finding his metaphorical tape in one of the newest ...more
7%
If we are ambivalent, the ambivalence began with Plato. He witnessed writing’s rising dominion; he asserted its force and feared its lifelessness. The writer-philosopher embodied a paradox. The same paradox was destined to reappear in different guises, each technology of information bringing its own powers and its own fears. It turns out that the “forgetfulness” Plato feared does not arise. It does not arise because Plato himself, with his mentor Socrates and his disciple Aristotle, designed a vocabulary of ideas, organized them into categories, set down rules of logic, and so fulfilled the ...more
12%
The new technology was a watershed: “It may be here also noted that the use of a 100 pound for a day at the rate of 8, 9, 10, or the like for a yeare hath beene scarcely known, till by Logarithms it was found out: for otherwise it requires so many laborious extractions of roots, as will cost more paines than the knowledge of the thing is accompted to be worth.” Knowledge has a value and a discovery cost, each to be counted and weighed.
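The seventeenth-century complaint can be made concrete. Daily compounding at 8 percent a year requires the daily growth factor, i.e. the 365th root of 1.08; logarithms replace that "laborious extraction" with a single division. A small sketch of the quoted case, 100 pounds at 8 percent:

```python
import math

# The quoted case: 100 pounds compounding daily at 8 percent a year.
# Before logarithms the daily growth factor meant extracting the 365th
# root of 1.08; with them it is one division of log(1.08) by 365.
P, annual_rate, days_in_year = 100.0, 0.08, 365

daily_log = math.log(1 + annual_rate) / days_in_year  # the division that replaced the root
after_30_days = P * math.exp(daily_log * 30)
after_a_year = P * math.exp(daily_log * days_in_year)  # recovers the full 8 pounds

print(f"after 30 days: {after_30_days:.2f}  after a year: {after_a_year:.2f}")
```

The year-end value comes back to 108 pounds exactly, but the point of the machinery is the intermediate dates, which were prohibitively expensive to compute before logarithm tables.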
13%
Thinking about language, while thinking in language, leads to puzzles and paradoxes. Babbage tried for a while to invent, or construct, a universal language, a symbol system that would be free of local idiosyncrasies and imperfections. He was not the first to try. Leibniz himself had claimed to be on the verge of a characteristica universalis that would give humanity “a new kind of an instrument increasing the powers of reason far more than any optical instrument has ever aided the power of vision.”
18%
Looking back, rhapsodists found the modern age foretold in a verse from the book of Job: “Canst thou send lightnings, that they may go and say unto thee, Here we are?” But lightning did not say anything—it dazzled, cracked, and burned, but to convey a message would require some ingenuity. In human hands, electricity could hardly accomplish anything, at first. It could not make a light brighter than a spark. It was silent. But it could be sent along wires to great distances—this was discovered early—and it seemed to turn wires into faint magnets. Those wires could be long: no one had found any ...more
20%
Educated at Yale College, the son of a Massachusetts preacher, Morse was an artist, not a scientist. In the 1820s and 1830s he spent much of his time traveling in England, France, Switzerland, and Italy to study painting. It was on one of these trips that he first heard about electric telegraphy or, in the terms of his memoirs, had his sudden insight: “like a flash of the subtle fluid which afterwards became his servant,” as his son put it. Morse told a friend who was rooming with him in Paris: “The mails in our country are too slow; this French telegraph is better, and would do even better in ...more
21%
Information that just two years earlier had taken days to arrive at its destination could now be there—anywhere—in seconds. This was not a doubling or tripling of transmission speed; it was a leap of many orders of magnitude. It was like the bursting of a dam whose presence had not even been known.
22%
In this time of conceptual change, mental readjustments were needed to understand the telegraph itself. Confusion inspired anecdotes, which often turned on awkward new meanings of familiar terms: innocent words like send, and heavily laden ones, like message. There was the woman who brought a dish of sauerkraut into the telegraph office in Karlsruhe to be “sent” to her son in Rastatt. She had heard of soldiers being “sent” to the front by telegraph. There was the man who brought a “message” into the telegraph office in Bangor, Maine. The operator manipulated the telegraph key and then placed ...more
24%
De Morgan was Babbage’s friend and Ada Byron’s tutor and a professor at University College, London. Boole was the son of a Lincolnshire cobbler and a lady’s maid and became, by the 1840s, a professor at Queen’s College, Cork. In 1847 they published separately and simultaneously books that amounted to the greatest milestone in the development of logic since Aristotle: Boole’s Mathematical Analysis of Logic, Being an Essay Towards a Calculus of Deductive Reasoning, and De Morgan’s Formal Logic: or, the Calculus of Inference, Necessary and Probable. The subject, esoteric as it was, had stagnated for ...more
24%
Boole thought of his system as a mathematics without numbers. “It is simply a fact,” he wrote, “that the ultimate laws of logic—those alone on which it is possible to construct a science of logic—are mathematical in their form and expression, although not belonging to the mathematics of quantity.” The only numbers allowed, he proposed, were zero and one. It was all or nothing: “The respective interpretation of the symbols 0 and 1 in the system of logic are Nothing and Universe.” Until now logic had belonged to philosophy. Boole was claiming possession on behalf of mathematics.
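Boole's restriction to 0 and 1 can be exercised directly. In his arithmetic, intersection ("and") is multiplication, complement is subtraction from the Universe, and his idempotent law x·x = x holds for exactly the two numbers 0 and 1, which is what forces the system onto them. A minimal sketch:

```python
# Boole's two-valued arithmetic: 0 is Nothing, 1 is the Universe.
# "And" is multiplication; complement is subtraction from the Universe;
# the idempotent law x*x = x singles out exactly the numbers 0 and 1.

def AND(x, y): return x * y          # both
def NOT(x):    return 1 - x          # complement within the Universe
def OR(x, y):  return x + y - x * y  # either, with the overlap counted once

for x in (0, 1):
    assert x * x == x                # the law that forces the system onto {0, 1}
    for y in (0, 1):
        # De Morgan's law, expressed in the same arithmetic
        assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))
print("Boole's laws hold on {0, 1}")
```

The same definitions fail as soon as any other number is admitted (2·2 ≠ 2), which is the sense in which this is "a mathematics without numbers."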
28%
To enable the fast expansion of this extraordinary network, the telephone demanded new technologies and new science. They were broadly of two kinds. One had to do with electricity itself: measuring electrical quantities; controlling the electromagnetic wave, as it was now understood—its modulation in amplitude and in frequency. Maxwell had established in the 1860s that electrical pulses and magnetism and light itself were all manifestations of a single force: “affectations of the same substance,” light being one more case of “an electromagnetic disturbance propagated through the field ...more
30%
Turing, like Babbage, meant his machine to compute numbers, but he had no need to worry about the limitations of iron and brass. Turing did not plan ever to build his machine.
30%
Few could follow it. It seems paradoxical—it is paradoxical—but Turing proved that some numbers are uncomputable. (In fact, most are.) Also, because every number corresponds to an encoded proposition of mathematics and logic, Turing had resolved Hilbert’s question about whether every proposition is decidable. He had proved that the Entscheidungsproblem has an answer, and the answer is no. An uncomputable number is, in effect, an undecidable proposition. So Turing’s computer—a fanciful, abstract, wholly imaginary machine—led him to a proof parallel to Gödel’s. Turing went further than Gödel by ...more
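The claim that most numbers are uncomputable rests on a counting argument: programs can be listed one by one, but Cantor's diagonal construction manufactures, from any such list of binary expansions, an expansion missing from it. A finite sketch of the diagonal move (on truncated expansions, for illustration only):

```python
# Cantor's diagonal move, the engine behind "most numbers are uncomputable":
# from any enumeration of binary expansions, flip the diagonal to build an
# expansion the enumeration missed. Programs are enumerable; by this
# construction their outputs can never exhaust the real numbers.

def diagonal(rows):
    """rows: equal-length 0/1 lists. Returns a row that differs from
    the i-th row at position i, hence appears nowhere in the list."""
    return [1 - row[i] for i, row in enumerate(rows)]

listed = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]
d = diagonal(listed)
print(d)  # differs from row i in column i
assert d not in listed
```

The same flip applied to an infinite enumeration of all computable expansions yields a perfectly well-defined number that no program produces.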
42%
The replication of DNA is a copying of information. The manufacture of proteins is a transfer of information: the sending of a message. Biologists could see this clearly now, because the message was now well defined and abstracted from any particular substrate. If messages could be borne upon sound waves or electrical pulses, why not by chemical processes? Gamow framed the issue simply: “The nucleus of a living cell is a storehouse of information.” Furthermore, he said, it is a transmitter of information. The continuity of all life stems from this “information system”; the proper study of ...more
43%
Genes, not organisms, are the true units of natural selection. They began as “replicators”—molecules formed accidentally in the primordial soup, with the unusual property of making copies of themselves. They are past masters of the survival arts. But do not look for them floating loose in the sea; they gave up that cavalier freedom long ago. Now they swarm in huge colonies, safe inside gigantic lumbering robots, sealed off from the outside world, communicating with it by tortuous indirect routes, manipulating it by remote control. They are in you and in me; they created us, body and mind; and ...more
44%
Richard Dawkins made his own connection between the evolution of genes and the evolution of ideas. His essential actor was the replicator, and it scarcely mattered whether replicators were made of nucleic acid. His rule is “All life evolves by the differential survival of replicating entities.” Wherever there is life, there must be replicators. Perhaps on other worlds replicators could arise in a silicon-based chemistry—or in no chemistry at all. What would it mean for a replicator to exist without chemistry? “I think that a new kind of replicator has recently emerged on this planet,” he ...more
45%
“To die for an idea; it is unquestionably noble,” H. L. Mencken wrote. “But how much nobler it would be if men died for ideas that were true!”
47%
But why do we say π is not random? Chaitin proposed a clear answer: a number is not random if it is computable—if a definable computer program will generate it. Thus computability is a measure of randomness. For Turing computability was a yes-or-no quality—a given number either is or is not. But we would like to say that some numbers are more random than others—they are less patterned, less orderly. Chaitin said the patterns and the order express computability. Algorithms generate patterns. So we can gauge computability by looking at the size of the algorithm. Given a number—represented as a ...more
48%
But if the scientist could discover a way to produce the same sequence with an algorithm, a computer program significantly shorter than the sequence, then he would surely know the events were not random. He would say that he had hit upon a theory. This is what science always seeks: a simple theory that accounts for a large set of facts and allows for prediction of events still to come. It is the famous Occam’s razor. “We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances,” said Newton, “for nature is pleased with simplicity.” ...more
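The scientist's test can be staged in miniature: a sequence that looks long and busy, yet is generated by a rule far shorter than itself, is by that fact not random. A toy sketch (the generating rule here is an invented example, not one from the book):

```python
# The compression test from the passage: a long sequence produced by a
# much shorter program is not random -- the short program IS the theory.

def generate(n):
    # a tiny invented "theory": each term is the last digit of i squared
    return [(i * i) % 10 for i in range(n)]

data = generate(1000)            # a thousand observed symbols...
rule = "(i*i)%10"                # ...explained by an eight-character rule
print(len(data), len(rule))      # 1000 observations vs an 8-symbol theory
```

The asymmetry between the thousand symbols and the eight-character rule is the whole content of "hitting upon a theory" in the algorithmic sense.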
48%
Kolmogorov introduced a new word for the thing he was trying to measure: complexity. As he defined this term, the complexity of a number, or message, or set of data is the inverse of simplicity and order and, once again, it corresponds to information. The simpler an object is, the less information it conveys. The more complexity, the more information. And, just as Gregory Chaitin did, Kolmogorov put this idea on a solid mathematical footing by calculating complexity in terms of algorithms. The complexity of an object is the size of the smallest computer program needed to generate it. An object ...more
48%
The Kolmogorov complexity of an object is the size, in bits, of the shortest algorithm needed to generate it. This is also the amount of information. And it is also the degree of randomness—Kolmogorov declared “a new conception of the notion ‘random’ corresponding to the natural assumption that randomness is the absence of regularity.” The three are fundamentally equivalent: information, randomness, and complexity—three powerful abstractions, bound all along like secret lovers.
50%
In a way, then, the use of minimal program size to define complexity seems perfect—a fitting apogee for Shannon information theory. In another way it remains deeply unsatisfying. This is particularly so when turning to the big questions—one might say, the human questions—of art, of biology, of intelligence. According to this measure, a million zeroes and a million coin tosses lie at opposite ends of the spectrum. The empty string is as simple as can be; the random string is maximally complex. The zeroes convey no information; coin tosses produce the most information possible. Yet these ...more
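The two extremes in the passage can be measured with an off-the-shelf compressor standing in for minimal program size. (A general-purpose compressor gives only an upper bound on the true Kolmogorov complexity, which is uncomputable, but the gap between the extremes is vivid all the same.)

```python
import os
import zlib

# A million zeroes versus a million coin tosses, with zlib's compressed
# size as a crude, computable stand-in for minimal program size.
zeros = bytes(1_000_000)            # a million zero bytes
tosses = os.urandom(1_000_000)      # a million random bytes ("coin tosses")

z_zeros = len(zlib.compress(zeros))
z_tosses = len(zlib.compress(tosses))
print(z_zeros, z_tosses)  # the zeroes collapse to roughly a kilobyte;
                          # the tosses barely shrink at all
```

By this measure the coin tosses are maximally complex and maximally informative, which is exactly the unease the passage describes: the measure crowns noise, not meaning.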
51%
The more energy, the faster the bits flip. Earth, air, fire, and water in the end are all made of energy, but the different forms they take are determined by information. To do anything requires energy. To specify what is done requires information. —Seth Lloyd (2006)
52%
Landauer tried in 1961 to prove von Neumann’s formula for the cost of information processing and discovered that he could not. On the contrary, it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice-versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he argued, increases entropy.
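The distinction Landauer drew can be exercised on paper. A bit flip, or a two-bit controlled-NOT, maps distinct inputs to distinct outputs and can be undone, so no information is lost; erasure maps two states onto one, and no function can recover the input. A minimal sketch:

```python
# Reversible versus irreversible, in Landauer's sense. A flip undoes itself;
# a controlled-NOT undoes itself; erasure collapses two inputs to one output,
# and that many-to-one step is where the entropy cost appears.

def flip(b):        # reversible: its own inverse
    return 1 - b

def cnot(a, b):     # reversible two-bit gate: (a, b) -> (a, a XOR b)
    return a, a ^ b

def erase(b):       # irreversible: both 0 and 1 become 0
    return 0

for b in (0, 1):
    assert flip(flip(b)) == b                 # round-trips: nothing lost
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)    # applying CNOT twice undoes it
assert erase(0) == erase(1)                   # two states collapse to one
print("flips and CNOTs invert; erasure does not")
```

The reversible gates preserve the count of distinguishable states; erasure halves it, and by Landauer's argument that is the only step that must dissipate heat.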
52%
Landauer and Bennett were a double act: a straight and narrow old IBM type and a scruffy hippie (in Bennett’s view, anyway). The younger man pursued Landauer’s principle by analyzing every kind of computer he could imagine, real and abstract, from Turing machines and messenger RNA to “ballistic” computers, carrying signals via something like billiard balls. He confirmed that a great deal of computation can be done with no energy cost at all. In every case, Bennett found, heat dissipation occurs only when information is erased. Erasure is the irreversible logical operation. When the head on a ...more