A Mind at Play: How Claude Shannon Invented the Information Age
2%
In the 1920s, when Claude was a boy, some three million farmers talked through networks like these, wherever the phone company found it unprofitable to build. It was America’s folk grid. Better networks than Claude’s carried voices along the fences, and kitchens and general stores doubled as switchboards.
13%
More to the point, it was a matter of deep conviction for Bush that specialization was the death of genius. “In these days, when there is a tendency to specialize so closely, it is well for us to be reminded that the possibilities of being at once broad and deep did not pass with Leonardo da Vinci or even Benjamin Franklin,” Bush said in a speech at MIT. “Men of our profession—we teachers—are bound to be impressed with the tendency of youths of strikingly capable minds to become interested in one small corner of science and uninterested in the rest of the world. . . . It is unfortunate when a . . .
13%
in some ways, eugenics are to modern genetics as alchemy is to chemistry, the disreputable relative in the attic.
13%
Each trait was coded like a book in a library. Searching for chess-playing ability would take Shannon to file 4598: 4 for mental trait, 5 for general mental ability, 9 for game-playing ability, and 8 for chess.
15%
The simplest explanation for his failure to publish is just that his attention did what it did so often: it wandered away.
23%
Cryptography was the war’s white noise: it was ubiquitous, and yet only those paying the closest attention could pick it up. It was one of the least understood components of the war machine. Compared to, say, the nuclear bomb, a visible and white-hot expression of the power of physics, the products of cryptographic analysts were arcane and mysterious—and kept classified for a generation or more.
24%
Shannon’s later work—but it also provided the first-ever proof of a critical concept in cryptology: the “one-time pad.”
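Shannon's result is easy to state concretely. A minimal sketch (illustrative Python, not Shannon's own formulation; the function names are invented here): a one-time pad XORs each message byte with a truly random key byte, and perfect secrecy holds provided the key is random, as long as the message, and never reused.

```python
# One-time pad sketch: XOR each message byte with a random key byte.
# Shannon's proof shows the ciphertext is statistically independent of
# the plaintext when the key is random, message-length, and used once.
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message), "key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

otp_decrypt = otp_encrypt  # XOR is its own inverse

plaintext = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(plaintext))  # fresh random key, never reused
ciphertext = otp_encrypt(plaintext, key)
assert otp_decrypt(ciphertext, key) == plaintext
```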
26%
And he’d invented this wonderful command. See, in those days they were working with individual commands. And the idea was to discover good commands. And I said, what is the command? And he said, the command is put a pulse to the hooter, put a pulse to the hooter. Now let me translate that. A hooter . . . in England is a loudspeaker. And by putting a pulse to it, it would just be put a pulse to a hooter. Now what good is this crazy command? Well, the good of this command is that if you’re in a loop you can have this command in that loop and every time it goes around the loop it will put a pulse . . .
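A rough sketch of why the command mattered (hypothetical Python, nothing like Turing's machine code; the names are invented for the illustration): one pulse per pass through a loop turns the loop's rate into an audible pitch, so an engineer could hear whether, and how fast, the machine was looping.

```python
import time

def hooter_pulse():
    # The terminal bell stands in for the loudspeaker ("hooter").
    print("\a", end="", flush=True)

def busy_loop(iterations, work_seconds):
    for _ in range(iterations):
        time.sleep(work_seconds)  # stand-in for the loop body's real work
        hooter_pulse()            # one pulse per pass through the loop

# A loop whose body takes 5 ms emits 200 pulses per second,
# i.e. an audible 200 Hz hum from the speaker.
busy_loop(iterations=100, work_seconds=0.005)
print(f"\nimplied tone: {1 / 0.005:.0f} Hz")
```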
30%
There’s only meaning where there’s prior agreement about our symbols. And all communication is like this, from waves sent over electrical wires, to the letters agreed upon to symbolize words, to the words agreed upon to symbolize things.
31%
The information value of a symbol depends on the number of alternatives that were killed off in its choosing. Symbols from large vocabularies bear more information than symbols from small ones. Information measures freedom of choice.
Matthew Royal
This theory of information sounds uncomputable. For an unbounded set like words, a vocabulary whose size is unknown, or which changes over time and across subcultures, implies a variable, unknowable amount of information for a static message... Which actually meshes well with postmodernist interpretations of a "text."
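Shannon made "freedom of choice" precise: a symbol drawn uniformly from N alternatives carries log2(N) bits, and when the alternatives are not equally likely, the average information per symbol is the entropy H = -Σ p·log2(p). A minimal sketch of both cases:

```python
# Shannon's measure of "freedom of choice": a symbol chosen uniformly
# from N alternatives carries log2(N) bits; with unequal probabilities
# the average information per symbol is the entropy.
from math import log2

def uniform_information(n_alternatives: int) -> float:
    return log2(n_alternatives)

def entropy(probabilities) -> float:
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(uniform_information(2))      # 1.0 bit: a fair coin flip
print(uniform_information(26))     # ~4.7 bits: one of 26 equally likely letters
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits: a skewed three-symbol vocabulary
```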
32%
New sciences demand new units of measurement—
33%
This was the hunch that Shannon had suggested to Hermann Weyl in Princeton in 1939, and which he had spent almost a decade building into theory: Information is stochastic. It is neither fully unpredictable nor fully determined.
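A toy illustration of that middle ground (the bigram table below is invented for the example, not Shannon's data): model text as a Markov chain, in which each pair of letters constrains, without fully determining, the next.

```python
# Text as a stochastic process: each two-letter context narrows the
# choice of the next letter but still leaves a choice to make.
import random

bigrams = {
    "th": ["e", "a", "i"],   # after "th", these letters are plausible
    "he": [" ", "r", "n"],
    "e ": ["t", "a", "s"],
    " t": ["h", "o"],
}

def generate(seed: str, length: int) -> str:
    text = seed
    for _ in range(length):
        choices = bigrams.get(text[-2:], [" "])  # constrained, yet free
        text += random.choice(choices)
    return text

print(generate("th", 20))  # vaguely English-shaped, never the same twice
```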
35%
Shannon guessed that the world’s wealth of English text could be cut in half with no loss of information: “When we write English, half of what we write is determined by the structure of the language and half is chosen freely.” Later on, his estimate of redundancy rose as high as 80 percent: only one in five characters actually bears information.
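One crude way to see that redundancy (an illustration only; Shannon estimated it with statistical analysis and human prediction experiments, not a compressor): whatever fraction a general-purpose compressor squeezes out of an English passage is a rough lower bound on how much of the text the structure of the language had already determined.

```python
# Rough empirical lower bound on English redundancy via compression.
import zlib

passage = (
    b"The most efficient message would resemble a string of random text. "
    b"Each new symbol would be as informative as possible, and thus as "
    b"surprising as possible. The messages we actually send waste symbols "
    b"all the time, because natural language is highly redundant."
)
compressed = zlib.compress(passage, level=9)
ratio = len(compressed) / len(passage)
print(f"compressed to {ratio:.0%} of the original; "
      f"so redundancy is at least {1 - ratio:.0%}")
```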
35%
The most efficient message would actually resemble a string of random text: each new symbol would be as informative as possible, and thus as surprising as possible. Not a single symbol would be wasted. Of course, the messages that we want to send one another—whether telegraphs or TV broadcasts—do “waste” symbols all the time. So the speed with which we can communicate over a given channel depends on how we encode our messages: how we package them, as compactly as possible, for shipment. Shannon’s first theorem proves that there is a point of maximum compactness for every message source.
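A sketch of that point of maximum compactness, using Huffman coding (published in 1952, after Shannon's theorem, but the simplest concrete demonstration): for a source with power-of-two probabilities, the optimal code's average length lands exactly on the entropy.

```python
# An optimal prefix code packs a source down to (near) its entropy,
# Shannon's "point of maximum compactness".
import heapq
from math import log2

def huffman_lengths(probs):
    # Build a Huffman tree; return each symbol's code length in bits.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, i2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:      # merging pushes symbols one level deeper
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, i2, syms1 + syms2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
entropy = -sum(p * log2(p) for p in probs)
avg_len = sum(p * n for p, n in zip(probs, lengths))
print(f"entropy = {entropy} bits, Huffman average = {avg_len} bits")
# Both come out to 1.75 bits: the code is maximally compact.
```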
35%
It was one of the beauties of a physical idea of information, a bit to stand among meters and grams: proof that the efficiency of our communication depends not just on the qualities of our media of talking, on the thickness of a wire or the frequency range of a radio signal, but on something measurable, pin-downable, in the message itself.
36%
Shannon proposed an unsettling inversion. Ignore the physical channel and accept its limits: we can overcome noise by manipulating our messages. The answer to noise is not in how loudly we speak, but in how we say what we say.
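A minimal sketch of the inversion (a naive threefold repetition code, the simplest example rather than a practical one): instead of boosting the signal, re-encode the message so the receiver can outvote the noise. Shannon's noisy-channel theorem guarantees that far more efficient codes exist at any rate below the channel capacity.

```python
# Overcoming noise by how we say what we say: redundancy in the coding,
# not volume in the channel.
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]  # send each bit three times

def decode(received):
    return [int(sum(received[i:i + 3]) >= 2)    # majority vote per block
            for i in range(0, len(received), 3)]

def noisy_channel(bits, flip_prob=0.1):
    return [b ^ (random.random() < flip_prob) for b in bits]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(message))
print("sent:   ", message)
print("decoded:", decode(received))  # usually recovers the message exactly
```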