The Information: A History, a Theory, a Flood
Kindle Notes & Highlights
Read between August 8 and August 29, 2018
3%
He claimed the idea in his first telegraph patent, in 1840:
3%
Long afterward, information theorists calculated that they had come within 15 percent of an optimal arrangement for telegraphing English text.
4%
Indeed, the formulas of the African drummers sometimes preserve archaic words that have been forgotten in the everyday language. For the Yaunde, the elephant is always “the great awkward one.” The resemblance to Homeric formulas—not merely Zeus, but Zeus the cloud-gatherer; not just the sea, but the wine-dark sea—is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne.
8%
He rehearsed a long list of examples and concluded: “I beg of you, therefore, good reader, do not scorn this great labor of mine and this order as something worthless.”
13%
will yet venture to predict, that a time will arrive, when the accumulating labour which arises from the arithmetical application of mathematical formulae, acting as a constantly retarding force, shall ultimately impede the useful progress of the science, unless this or some equivalent method is devised for relieving it from the overwhelming incumbrance of numerical detail.
15%
He was a mathematical raconteur—that was no contradiction, in this time and place.
18%
The Chappe brothers set a pair of pendulum clocks to beat in synchrony, each with its pointer turning around a dial at relatively high speed. They experimented with this in their hometown, Brûlon, about one hundred miles west of Paris. Ignace, the sender, would wait till the pointer reached an agreed number and at that instant signal by ringing a bell or firing a gun or, more often, banging upon a casserole. Upon hearing the sound, Claude, stationed a quarter mile away, would read the appropriate number off his own clock. He could convert number to words by looking them up in a prearranged ...more
20%
Meanwhile, dozens of young newspapers around the nation were modernistically calling themselves “The Telegraph.” They, too, were in the far-writing business.
Note:
Telegraph newspapers predate the electric telegraph.
22%
For less sensitive occasions, Vail proposed using abbreviated versions of common phrases. Instead of “give my love to,” he suggested sending “gmlt.”
30%
Turing was encoding his machines just as Gödel had encoded the language of symbolic logic. This obliterated the distinction between data and instructions: in the end they were all numbers. For every computable number, there must be a corresponding machine number.
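The idea that "in the end they were all numbers" can be illustrated with a toy encoding. This is not Turing's or Gödel's actual scheme; `encode` and `decode` are hypothetical helpers that merely show how a machine's instruction table can be packed into a single integer and recovered from it, so that programs and data live in the same space.

```python
def encode(instructions, base=256):
    """Pack a list of small integers (an 'instruction table') into one integer."""
    n = 0
    for ins in reversed(instructions):
        n = n * base + ins + 1   # +1 so leading zeros survive the round trip
    return n

def decode(n, base=256):
    """Recover the instruction list from its 'machine number'."""
    out = []
    while n > 0:
        n, rem = divmod(n, base)
        out.append(rem - 1)
    return out

program = [7, 0, 42, 3]           # an arbitrary toy instruction table
number = encode(program)          # the whole machine, as one number
assert decode(number) == program
```

Because every machine is now a number, machines can be fed to other machines as input, which is exactly the move Turing's argument needs.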
30%
Also, because every number corresponds to an encoded proposition of mathematics and logic, Turing had resolved Hilbert’s question about whether every proposition is decidable. He had proved that the Entscheidungsproblem has an answer, and the answer is no. An uncomputable number is, in effect, an undecidable proposition.
31%
THE WEST STREET HEADQUARTERS OF BELL LABORATORIES, WITH TRAINS OF THE HIGH LINE RUNNING THROUGH
32%
These sequences increasingly “look” like English. Less subjectively, it turns out that touch typists can handle them with increasing speed—another indication of the ways people unconsciously internalize a language’s statistical structure.
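The sequences referred to are Shannon's n-gram approximations to English. A minimal sketch of how such sequences can be produced (the function names and the sample text are my own, not Shannon's): record which characters follow each short context in a source text, then emit characters by sampling from those followers.

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Map each length-`order` context to the characters seen after it."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, order=2, length=60, seed=1):
    """Emit characters by sampling followers of the last `order` characters,
    mimicking the source's n-gram statistics."""
    rng = random.Random(seed)
    out = rng.choice(list(model))
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:
            break
        out += rng.choice(followers)
    return out

sample = ("in an oral culture inspiration has to serve clarity and memory "
          "first and the muses are the daughters of mnemosyne")
approx = generate(build_model(sample))
```

By construction every three-character window of the output occurs somewhere in the source, which is why higher-order approximations increasingly "look" like the language they were built from.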
39%
Entropy thus became a physical equivalent of probability: the entropy of a given macrostate is the logarithm of the number of its possible microstates. The second law, then, is the tendency of the universe to flow from less likely (orderly) to more likely (disorderly) macrostates.
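Boltzmann's definition can be checked on the simplest possible system, a row of coins (my own illustration, not the book's): the macrostate "k heads out of n" is realized by C(n, k) microstates, and the balanced, disordered macrostates vastly outnumber the ordered ones.

```python
import math

def entropy(n, k):
    """Boltzmann-style entropy of the macrostate 'k heads out of n coins':
    the log of the number of microstates (arrangements) realizing it."""
    return math.log(math.comb(n, k))

n = 100
# The balanced macrostate has far more microstates -- hence higher
# entropy -- than the nearly ordered one; all-heads has exactly one.
assert entropy(n, 50) > entropy(n, 90) > entropy(n, 100) == 0.0
```

The second law then reads off directly: left to shuffle at random, the system drifts toward the macrostates with the most microstates, simply because there are more ways to be there.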
44%
For this bodiless replicator itself, Dawkins proposed a name. He called it the meme, and it became his most memorable invention, far more influential than his selfish genes or his later proselytizing against religiosity.
47%
Chaitin, as a high-school student in the Columbia Science Honors Program,
47%
Algorithms generate patterns. So we can gauge computability by looking at the size of the algorithm. Given a number—represented as a string of any length—we ask, what is the length of the shortest program that will generate it?
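The shortest-program length (Kolmogorov complexity) is uncomputable, but a general-purpose compressor gives a crude, computable upper bound on it, which is enough to see the idea in action. This sketch, using zlib as the stand-in compressor, is my own illustration, not a method from the book.

```python
import os
import zlib

def description_length(s: bytes) -> int:
    """Length of a compressed encoding: a crude, computable upper bound
    on the (uncomputable) shortest program that prints s."""
    return len(zlib.compress(s, 9))

patterned = b"01" * 500        # generated by a very short rule
random_ish = os.urandom(1000)  # no rule much shorter than the data itself

# The patterned string admits a far shorter description.
assert description_length(patterned) < description_length(random_ish)
```

The patterned string compresses to a few dozen bytes; random bytes compress to slightly more than their own length, mirroring the definition: a string is random to the extent that no program much shorter than the string can generate it.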
48%
It made sense now to say that a dynamical system produces information. If it is unpredictable, it produces a great deal of information.
49%
Solomonoff, Kolmogorov, and Chaitin tackled three different problems and came up with the same answer. Solomonoff was interested in inductive inference: given a sequence of observations, how can one make the best predictions about what will come next? Kolmogorov was looking for a mathematical definition of randomness: what does it mean to say that one sequence is more random than another, when they have the same probability of emerging from a series of coin flips? And Chaitin was trying to find a deep path into Gödel incompleteness by way of Turing and Shannon—as he said later, “putting ...more
50%
Everything we care about lies somewhere in the middle, where pattern and randomness interlace.
50%
The amount of work it takes to compute something had been mostly disregarded—set aside—in all the theorizing based on Turing machines, which work, after all, so ploddingly. Bennett brought it back.
50%
Rather, he proposed, the value of a message lies in “what might be called its buried redundancy—parts predictable only with difficulty, things the receiver could in principle have figured out without being told, but only at considerable cost in money, time, or computation.” When we value an object’s complexity, or its information content, we are sensing a lengthy hidden computation.
51%
What is the physical cost of logical work? “Computers,” he wrote provocatively, “may be thought of as engines for transforming free energy into waste heat and mathematical work.”
52%
On the contrary, it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice-versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he argued, increases entropy.
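The reversible/irreversible distinction can be made concrete with two tiny gates (my own sketch of the standard examples): a controlled-NOT keeps both of its bits, so running it twice undoes it, while an AND collapses several inputs onto one output and so destroys information.

```python
def cnot(a, b):
    """Controlled-NOT: flips b when a is 1. Both bits survive, so no
    information is lost and no entropy need be generated."""
    return a, a ^ b

# Reversible: the gate is its own inverse on every input.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)

# Irreversible by contrast: three distinct input pairs of AND all yield 0,
# so the inputs cannot be recovered from the output.
assert {(a & b) for a, b in [(0, 0), (0, 1), (1, 0)]} == {0}
```

On Landauer's accounting, only operations like the second one, which erase the distinction between inputs, carry an unavoidable entropy cost.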
52%
In Szilárd’s thought experiment, the demon does not incur an entropy cost when it observes or chooses a molecule. The payback comes at the moment of clearing the record, when the demon erases one observation to make room for the next. Forgetting takes work.
53%
Putting qubits at work together does not merely multiply their power; the power increases exponentially. In classical computing, where a bit is either-or, n bits can encode any one of 2ⁿ values.
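The classical half of that claim is easy to check by enumeration (a trivial sketch of my own): n bits have 2ⁿ distinct values, but a classical register holds only one of them at a time, whereas n qubits carry amplitudes over all 2ⁿ basis states at once.

```python
from itertools import product

def classical_states(n):
    """All distinct values an n-bit classical register can take --
    one at a time."""
    return list(product((0, 1), repeat=n))

assert len(classical_states(10)) == 2 ** 10   # 1024 values
```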
53%
In December 2009 a team distributed in Lausanne, Amsterdam, Tokyo, Paris, Bonn, and Redmond, Washington, used many hundreds of machines working almost two years
Note:
TODO Look up this paper to understand measurement
57%
“The name of a man is like his shadow,” said the Viennese onomatologist Ernst Pulgram in 1954. “It is not of his substance and not of his soul, but it lives with him and by him. Its presence is not vital, nor its absence fatal.” Those were simpler times.
58%
As a duplicating machine, the printing press not only made texts cheaper and more accessible; its real power was to make them stable.
59%
Strategies emerge for coping. There are many, but in essence they all boil down to two: filter and search.
59%
Once a piece of information is filed, it is statistically unlikely ever to be seen again by human eyes.