Kindle Notes & Highlights
The Information: A History, a Theory, a Flood, by James Gleick
Read between August 8 and August 29, 2018
He claimed the idea in his first telegraph patent, in 1840: …
Long afterward, information theorists calculated that they had come within 15 percent of an optimal arrangement for telegraphing English text.
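One way to make that claim concrete (my own sketch, not from the book): read "optimal arrangement" as the best possible reassignment of Morse's own codewords to letters, using the usual timing model (dot = 1 unit, dash = 3, a 1-unit gap between symbols) and approximate English letter frequencies. Both tables below are illustrative data I supply, not figures from the text.

```python
# Sketch: how close is the historical Morse letter-to-codeword assignment
# to the best possible reassignment of those same codewords?
MORSE = {
    'E': '.',    'T': '-',    'A': '.-',   'O': '---',  'I': '..',
    'N': '-.',   'S': '...',  'H': '....', 'R': '.-.',  'D': '-..',
    'L': '.-..', 'C': '-.-.', 'U': '..-',  'M': '--',   'W': '.--',
    'F': '..-.', 'G': '--.',  'Y': '-.--', 'P': '.--.', 'B': '-...',
    'V': '...-', 'K': '-.-',  'J': '.---', 'X': '-..-', 'Q': '--.-',
    'Z': '--..',
}
FREQ = {  # percent; rough, commonly cited figures for English text
    'E': 12.7, 'T': 9.1, 'A': 8.2, 'O': 7.5, 'I': 7.0, 'N': 6.7,
    'S': 6.3, 'H': 6.1, 'R': 6.0, 'D': 4.3, 'L': 4.0, 'C': 2.8,
    'U': 2.8, 'M': 2.4, 'W': 2.4, 'F': 2.2, 'G': 2.0, 'Y': 2.0,
    'P': 1.9, 'B': 1.5, 'V': 1.0, 'K': 0.8, 'J': 0.15, 'X': 0.15,
    'Q': 0.1, 'Z': 0.07,
}

def duration(code):
    # dot = 1 unit, dash = 3 units, plus a 1-unit silence between symbols
    return sum(1 if s == '.' else 3 for s in code) + (len(code) - 1)

total = sum(FREQ.values())
actual = sum(FREQ[c] * duration(MORSE[c]) for c in MORSE) / total

# Best reassignment: give the shortest codewords to the commonest letters.
codes = sorted(MORSE.values(), key=duration)
letters = sorted(FREQ, key=FREQ.get, reverse=True)
optimal = sum(FREQ[l] * duration(c) for l, c in zip(letters, codes)) / total

print(f"actual:   {actual:.2f} time units per letter")
print(f"optimal:  {optimal:.2f} time units per letter")
print(f"overhead: {100 * (actual / optimal - 1):.0f}%")
```

Morse and Vail already gave the shortest codes to the commonest letters, which is why the overhead comes out small.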
Indeed, the formulas of the African drummers sometimes preserve archaic words that have been forgotten in the everyday language. For the Yaunde, the elephant is always “the great awkward one.” The resemblance to Homeric formulas—not merely Zeus, but Zeus the cloud-gatherer; not just the sea, but the wine-dark sea—is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne.
He rehearsed a long list of examples and concluded: “I beg of you, therefore, good reader, do not scorn this great labor of mine and this order as something worthless.”
…will yet venture to predict, that a time will arrive, when the accumulating labour which arises from the arithmetical application of mathematical formulae, acting as a constantly retarding force, shall ultimately impede the useful progress of the science, unless this or some equivalent method is devised for relieving it from the overwhelming incumbrance of numerical detail.
He was a mathematical raconteur—that was no contradiction, in this time and place.
The Chappe brothers set a pair of pendulum clocks to beat in synchrony, each with its pointer turning around a dial at relatively high speed. They experimented with this in their hometown, Brûlon, about one hundred miles west of Paris. Ignace, the sender, would wait till the pointer reached an agreed number and at that instant signal by ringing a bell or firing a gun or, more often, banging upon a casserole. Upon hearing the sound, Claude, stationed a quarter mile away, would read the appropriate number off his own clock. He could convert number to words by looking them up in a prearranged …
For less sensitive occasions, Vail proposed using abbreviated versions of common phrases. Instead of “give my love to,” he suggested sending “gmlt.”
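Vail's scheme is just a codebook, a lookup table of phrase-to-abbreviation substitutions. A toy rendering (only "gmlt" comes from the text; the other entries are hypothetical stand-ins of my own):

```python
# A tiny codebook in Vail's spirit: substitute agreed abbreviations
# for common phrases before sending.
CODEBOOK = {
    "give my love to": "gmlt",
    "what is the news": "win",   # hypothetical entry
    "answer immediately": "ai",  # hypothetical entry
}

def abbreviate(message: str) -> str:
    for phrase, code in CODEBOOK.items():
        message = message.replace(phrase, code)
    return message

print(abbreviate("give my love to mother"))  # -> "gmlt mother"
```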
Turing was encoding his machines just as Gödel had encoded the language of symbolic logic. This obliterated the distinction between data and instructions: in the end they were all numbers. For every computable number, there must be a corresponding machine number.
Also, because every number corresponds to an encoded proposition of mathematics and logic, Turing had resolved Hilbert’s question about whether every proposition is decidable. He had proved that the Entscheidungsproblem has an answer, and the answer is no. An uncomputable number is, in effect, an undecidable proposition.
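A minimal illustration of the "they were all numbers" point (my sketch, not Turing's actual numbering scheme): any program text can be read off as one big integer, and recovered from it, so instructions are data and every machine has a machine number.

```python
# Not Turing's scheme, just the core idea: a program's text, viewed as
# bytes, is a single (very large) integer, and the mapping is invertible.
def to_number(program: str) -> int:
    return int.from_bytes(program.encode("utf-8"), "big")

def from_number(n: int) -> str:
    return n.to_bytes((n.bit_length() + 7) // 8, "big").decode("utf-8")

src = "print('hello')"
n = to_number(src)
print(n)               # the "machine number" of this little program
print(from_number(n))  # the program, recovered from its number
```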
[Photo caption] The West Street headquarters of Bell Laboratories, with trains of the High Line running through.
These sequences increasingly “look” like English. Less subjectively, it turns out that touch typists can handle them with increasing speed—another indication of the ways people unconsciously internalize a language’s statistical structure.
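A toy version of the statistical approximations Shannon built (my own sketch; the inline sample text is made up, and the n-gram order is an arbitrary choice; any large English corpus works far better):

```python
import random
from collections import defaultdict

# Learn which character tends to follow each 3-character window,
# then generate text by sampling from those observed followers.
TEXT = (
    "the theory of information began as a theory of communication and "
    "the measure of information is the measure of surprise in a message "
    "and the structure of a language is a structure of statistics"
)

def train(text, order=3):
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, seed, length=80, order=3):
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:
            break
        out += random.choice(followers)
    return out

print(generate(train(TEXT), seed="the"))
```

Even at this tiny scale the output starts to "look" like English, because the trigram statistics are English statistics.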
Entropy thus became a physical equivalent of probability: the entropy of a given macrostate is the logarithm of the number of its possible microstates. The second law, then, is the tendency of the universe to flow from less likely (orderly) to more likely (disorderly) macrostates.
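This is Boltzmann's S = k_B log W in miniature (my sketch, with k_B dropped): for n coin flips, a macrostate is "k heads," and the disordered middle macrostates contain vastly more microstates, so the drift toward them is just a drift toward the overwhelmingly likely.

```python
from math import comb, log

# For n = 20 coin flips, the macrostate "k heads" contains C(n, k)
# microstates (orderings); its entropy is the log of that count.
n = 20
for k in (0, 5, 10):
    W = comb(n, k)
    print(f"{k:2d} heads: W = {W:6d}, log W = {log(W):.2f}")
```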
For this bodiless replicator itself, Dawkins proposed a name. He called it the meme, and it became his most memorable invention, far more influential than his selfish genes or his later proselytizing against religiosity.
Chaitin, as a high-school student in the Columbia Science Honors Program, …
Algorithms generate patterns. So we can gauge computability by looking at the size of the algorithm. Given a number—represented as a string of any length—we ask, what is the length of the shortest program that will generate it?
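That shortest-program length (Kolmogorov complexity) is uncomputable, but compressed size gives a computable upper-bound stand-in, a rough proxy rather than the real measure:

```python
import os, zlib

# A patterned string shrinks drastically under compression; a random
# one barely shrinks at all. Compressed length bounds the length of
# the shortest program that could regenerate the string.
patterned = b"01" * 5000        # 10,000 bytes of obvious pattern
random_ish = os.urandom(10000)  # 10,000 bytes with no exploitable pattern

for name, s in [("patterned", patterned), ("random", random_ish)]:
    print(f"{name:9}: {len(s)} bytes -> {len(zlib.compress(s))} compressed")
```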
It made sense now to say that a dynamical system produces information. If it is unpredictable, it produces a great deal of information.
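A concrete instance (my sketch): the logistic map at r = 4 has Lyapunov exponent log 2, so it produces about one bit of information per step; nearby trajectories double their separation each iteration until prediction fails entirely.

```python
# Two states of the chaotic map x -> 4x(1 - x) that agree to six
# decimal places: watch the gap between them roughly double per step.
x, y = 0.400000, 0.400001
for step in range(1, 25):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if step % 4 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.2e}")
```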
Solomonoff, Kolmogorov, and Chaitin tackled three different problems and came up with the same answer. Solomonoff was interested in inductive inference: given a sequence of observations, how can one make the best predictions about what will come next? Kolmogorov was looking for a mathematical definition of randomness: what does it mean to say that one sequence is more random than another, when they have the same probability of emerging from a series of coin flips? And Chaitin was trying to find a deep path into Gödel incompleteness by way of Turing and Shannon—as he said later, “putting …”
Everything we care about lies somewhere in the middle, where pattern and randomness interlace.
The amount of work it takes to compute something had been mostly disregarded—set aside—in all the theorizing based on Turing machines, which work, after all, so ploddingly. Bennett brought it back.
Rather, he proposed, the value of a message lies in “what might be called its buried redundancy—parts predictable only with difficulty, things the receiver could in principle have figured out without being told, but only at considerable cost in money, time, or computation.” When we value an object’s complexity, or its information content, we are sensing a lengthy hidden computation.
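A loose illustration of that buried redundancy (my sketch, not Bennett's formal definition of logical depth): a message that looks incompressible, yet the receiver could in principle regenerate it from a one-number "program" plus computation. Its value is the work it saves.

```python
import random, zlib

# 10,000 bytes of seeded PRNG output: zlib finds almost nothing to
# squeeze out, yet the whole message is determined by a single seed.
def message(seed=42, n=10_000):
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

msg = message()
print(len(msg), "bytes; zlib compresses it to", len(zlib.compress(msg)))
print("yet it is fully determined by the seed 42 and this short program")
```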
What is the physical cost of logical work? “Computers,” he wrote provocatively, “may be thought of as engines for transforming free energy into waste heat and mathematical work.”
On the contrary, it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice-versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he argued, increases entropy.
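The contrast in miniature (my sketch): NOT is a bijection on {0, 1}, so applying it twice recovers the input and no information is lost; AND collapses three input pairs onto the output 0, so the inputs cannot be recovered, and that lost information is what must be paid for in entropy.

```python
# NOT is reversible: invert it by applying it again.
NOT = {0: 1, 1: 0}
assert all(NOT[NOT[b]] == b for b in (0, 1))

# AND is irreversible: three distinct inputs map to the same output.
AND = {(a, b): a & b for a in (0, 1) for b in (0, 1)}
preimages = [inputs for inputs, out in AND.items() if out == 0]
print("inputs mapped to 0 by AND:", preimages)  # three -> not invertible
```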
In Szilárd’s thought experiment, the demon does not incur an entropy cost when it observes or chooses a molecule. The payback comes at the moment of clearing the record, when the demon erases one observation to make room for the next. Forgetting takes work.
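And that forgetting has a definite price: Landauer's bound of k_B T ln 2 of dissipated energy per erased bit. At room temperature it is tiny but not zero:

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact since the 2019 SI)
T = 300.0            # kelvin, roughly room temperature
print(f"{k_B * T * log(2):.3e} J per erased bit")  # ~2.9e-21 J
```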
Putting qubits at work together does not merely multiply their power; the power increases exponentially. In classical computing, where a bit is either-or, n bits can encode any one of 2ⁿ values.
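A toy statevector calculation (my sketch, using numpy) makes the exponential bookkeeping visible: n qubits require 2ⁿ complex amplitudes, and a Hadamard on every qubit spreads the register across all 2ⁿ classical values at once.

```python
import numpy as np

n = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # single-qubit Hadamard

# Build H (x) H (x) H and apply it to the basis state |000>.
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)

state = np.zeros(2**n)
state[0] = 1.0          # start in |000>
state = Hn @ state
print(len(state), "amplitudes")   # 2**3 = 8
print(state)                      # all equal to 1/sqrt(8)
```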
“The name of a man is like his shadow,” said the Viennese onomatologist Ernst Pulgram in 1954. “It is not of his substance and not of his soul, but it lives with him and by him. Its presence is not vital, nor its absence fatal.” Those were simpler times.
As a duplicating machine, the printing press not only made texts cheaper and more accessible; its real power was to make them stable.
Strategies emerge for coping. There are many, but in essence they all boil down to two: filter and search.
Once a piece of information is filed, it is statistically unlikely ever to be seen again by human eyes.