A Mind at Play: How Claude Shannon Invented the Information Age
Bell Labs “was where the future, which is what we now happen to call the present, was conceived and designed,” wrote Jon Gertner in The Idea Factory, his history of the Labs.
Its employees were given extraordinary freedom. Figure out, a Bell researcher might be told, how “fundamental questions of physics or chemistry might someday affect communications.” Might someday—Bell researchers were encouraged to think decades down the road, to imagine how technology could radically alter the character of everyday life, to wonder how Bell might “connect all of us, and all of our new machines, together.”
It was one thing for an industrial laboratory to hire qualified PhDs and put them to work on various pressing engineering problems. But Nobel Prizes? Pie-in-the-sky projects? Ten or twenty years of leeway? Even accounting for nostalgia, Thornton Fry’s judgment hardly seems out of place; looking back on the Labs, he called it “a fairyland company.”
The thinkers who thrived at the Labs were those who, confronted with a nearly limitless field of questions, chose the “right” ones: the ones most fertile of breakthroughs in technique or theory, the ones that opened on broad vistas rather than dead ends. This choice of questions has always been a matter of intuition as much as erudition, the irreducible kernel of art in science.
“I had freedom to do anything I wanted from almost the day I started,” he reflected. “They never told me what to work on.”
mathematicians who aspired to build things rather than simply think about things.
Fry’s hunch was that not all mathematicians wanted to write papers and chase tenure.
“I got quite a kick,” Shannon wrote to Vannevar Bush, “when I found out that the Labs are actually using [my] relay algebra in design work and attribute a couple new circuit designs to it.” As with a tinkerer who successfully flips the switch on his latest creation, it isn’t difficult to imagine Bush reading that sentence, sitting back, and smiling with satisfaction.
He had left a mark on men who were discerning judges of raw intellectual horsepower, and they found in him one of their own.
As a philosopher, he considered Einstein’s work on the relativity of space and time not only a turning point in science, but a new insight into the relationship between human consciousness and the external world.
What if the mathematical model for a message sent over telephone or telegraph wires had something in common with the models for the motion of elementary particles?
“A kind of guilt or depression worms inside of you, and you begin to worry about not getting any ideas. . . . You’re not in contact with the experimental guys. You don’t have to think how to answer questions from the students. Nothing!”
But I lacked that strange and wonderful creative spark that makes a good researcher. Thus I realized that there was a definite ceiling on my possibilities as a mathematics professor.”
“Do not overestimate science, do not think that science is all that there is,” he urged students in a 1966 talk. “Do not concentrate so completely on science that there’s nobody in this room who is going to spend the next seven days without reading some poetry. I hope that there’s nobody in this room that’s going to spend the next seven days without listening to some music, some good music, some modern music, some music.”
How many tons of explosive force must a bomb release to create a certain amount of damage to certain types of targets? In what sorts of formation should bombers fly? Should an airplane be heavily armored or should it be stripped of defenses so it can fly faster? At what depths should an anti-submarine weapon dropped from an airplane explode? How many anti-aircraft guns should be placed around a critical target? In short, precisely how should these new weapons be used to produce the greatest military payoff?
Cryptology represents a problem of both software and hardware. The “software” can, in principle, be anything. In one well-known example, about 500 Navajo Indians were recruited in World War II to transmit coded messages because their native tongue was complex enough—and unfamiliar enough—to evade detection by the Axis powers.
“rather like Rimsky-Korsakov’s bravura violin spectacular ‘The Flight of the Bumblebee.’
the Japanese surrender was signed. This paper, “A Mathematical Theory of Cryptography—Case 20878,”
The one-time pad system was the conceptual basis of Bell Labs’ Vocoder, though it was first devised as early as 1882. It requires that a coded message be preceded by the key to decode it, that the key be a secret, entirely random set of symbols the same size as the message, and that the key be used only once.
It took Claude Shannon, and the space of more than a half century, to prove that a code constructed under these stringent (and usually impracticable) conditions would be entirely unbreakable—that perfect secrecy within a cryptographic system was, at least in theory, possible.
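The "stringent (and usually impracticable) conditions" the excerpt describes — a truly random secret key, as long as the message, used exactly once — can be illustrated with a minimal one-time pad sketch (function names are illustrative, not from the book):

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """XOR one-time pad. The key must be truly random, secret,
    exactly as long as the message, and never reused."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

# XOR is its own inverse, so decryption is the same operation.
otp_decrypt = otp_encrypt

msg = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(msg))  # one-time, secret, random key
ct = otp_encrypt(msg, key)
assert otp_decrypt(ct, key) == msg
```

Shannon's proof of perfect secrecy amounts to showing that, under these conditions, the ciphertext is statistically independent of the message: every plaintext of the same length is equally consistent with what the eavesdropper sees.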
Turing and I had an awful lot in common, and we would talk about that kind of question. He had already written his famous paper about Turing Machines, so called, as they call them now, Turing Machines. They didn’t call them that then. And we spent much time discussing the concepts of what’s in the human brain. How the brain is built, how it works and what can be done with machines and whether you can do anything with machines that you can do with the human brain and so on. And that kind of thing. And I had talked to him several times about my notions on Information Theory, I know, and he was …
We had dreams, Turing and I used to talk about the possibility of simulating entirely the human brain, could we really get a computer which would be the equivalent of the human brain or even a lot better? And it seemed easier then than it does now maybe. We both thought that this should be possible in not very long, in ten or 15 years. Such was not the case, it hasn’t been done in thirty years.
The ancient art of mathematics . . . does not reward speed so much as patience, cunning and, perhaps most surprising of all, the sort of gift for collaboration and improvisation that characterizes the best jazz musicians. —Gareth Cook
S. Eliot, the latter his favorite author.
It turns out that there were three certified geniuses at BTL [Bell Telephone Laboratories] at the same time, Claude Shannon of information theory fame, John Pierce, of communication satellite and traveling wave amplifier fame, and Barney. Apparently the three of those people were intellectually INSUFFERABLE. They were so bright and capable, and they cut an intellectual swath through that engineering community, that only a prestige lab like that could handle all three at once.
“He didn’t have much patience with people who weren’t as smart as he was.”
law of squares as “a fiction of the schools”: a formula built for elegance in the pages of journals (it even looked like Newton’s famous inverse-square law of gravity!),
It was Shannon who made the final synthesis, who defined the concept of information and effectively solved the problem of noise. It was Shannon who was credited with gathering the threads into a new science. But he had important predecessors at Bell Labs, two engineers who had shaped his thinking since he discovered their work in Ann Arbor, who were the first to consider how information might be put on a scientific footing, and whom Shannon’s landmark paper singled out as pioneers.
W = k log m

W is the speed of intelligence. m is the number of “current values” that the system can transmit. A current value is a discrete signal that a telegraph system is equipped to send: the number of current values is something like the number of possible letters in an alphabet. If the system can only communicate “on” or “off,” it has two current values; if it can communicate “negative current,” “off,” and “positive current,” it has three; and if it can communicate “strong negative,” “negative,” “off,” “positive,” and “strong positive,” it has five. Finally, k is the number of current …
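Taking the logarithm base 2 (so that W comes out in bits per second, a convention the excerpt doesn't specify), the formula can be sketched as follows; the function name is illustrative:

```python
import math

def nyquist_speed(k: float, m: int) -> float:
    """Nyquist's W = k log m: k signals per second, each chosen
    from m possible current values, measured here in bits."""
    return k * math.log2(m)

# The logarithm is what makes the measure behave sensibly:
# squaring the alphabet (4 = 2**2) doubles, not squares, the speed.
assert nyquist_speed(1, 4) == 2 * nyquist_speed(1, 2)
```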
To choose is to kill off alternatives.
Information measures freedom of choice.
Given these quantities, and calling the amount of information transmitted H, we have: H = k log sⁿ
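Because log sⁿ = n log s, Hartley's measure makes information grow linearly with message length — n symbols carry n times the information of one. A small check of that identity (the function name is illustrative):

```python
import math

def hartley_information(s: int, n: int) -> float:
    """Hartley's H for a message of n symbols, each drawn from an
    alphabet of s possibilities, in bits (log base 2, k = 1)."""
    return n * math.log2(s)  # equivalent to log2(s**n)

# Ten binary symbols carry ten bits; the identity log(s**n) = n log s
# is what makes information add up symbol by symbol.
assert hartley_information(2, 10) == 10.0
assert math.isclose(hartley_information(3, 4), math.log2(3 ** 4))
```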
What is the nature of communication?
What happens when we send a message? Is there information in a message you can’t even understand?
measure for information, for example, helps us uncover the connections between the bandwidth of a medium, and the information in the message, and the time devoted to sending it. As Hartley showed, there is always a trade-off between these three quantities.
We can say, from our hindsight, that if the step were obvious, it surely wouldn’t have stayed untaken for twenty years. If the step were obvious, it surely wouldn’t have been met with such astonishment. “It came as a bomb,” said Pierce.
information measures freedom of choice: what makes messages interesting is that they are “selected from a set of possible messages.”
First, though, Shannon saw that information science had still failed to pin down something crucial about information: its probabilistic nature.
What does information really measure? It measures the uncertainty we overcome. It measures our chances of learning something we haven’t yet learned. Or, more specifically: when one thing carries information about another—just as a meter reading tells us about a physical quantity, or a book tells us about a life—the amount of information it carries reflects the reduction in uncertainty about the object.
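The standard way to make "the uncertainty we overcome" concrete is the entropy formula Shannon would build on this insight, H = −Σ p log₂ p: a distribution's expected surprise. A minimal sketch:

```python
import math

def entropy(probs) -> float:
    """Shannon entropy in bits: the average uncertainty removed
    when the outcome of this distribution is revealed."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin leaves maximal uncertainty; a loaded coin less;
# a certain outcome tells us nothing we didn't already know.
assert entropy([0.5, 0.5]) == 1.0
assert entropy([0.9, 0.1]) < 1.0
assert entropy([1.0]) == 0.0
```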
Information is stochastic. It is neither fully unpredictable nor fully determined. It unspools in roughly guessable ways. That’s why the classic model of a stochastic process is a drunk man stumbling down the street.
walking behavior of drunks. For instance, they tend to gravitate toward lampposts.
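The drunkard's walk the excerpts invoke is easy to simulate: each step is random, yet the path "unspools in roughly guessable ways." A toy sketch (function name illustrative):

```python
import random

def drunkard_walk(steps: int, seed: int = 0) -> list[int]:
    """1-D random walk: each step is +/-1 at random. No single step
    is determined, but the path as a whole is statistically tame."""
    rng = random.Random(seed)
    pos, path = 0, [0]
    for _ in range(steps):
        pos += rng.choice((-1, 1))
        path.append(pos)
    return path

path = drunkard_walk(20)
assert len(path) == 21
assert all(abs(b - a) == 1 for a, b in zip(path, path[1:]))
```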
We do it by making our rules more restrictive. We do it by making ourselves more predictable. We do it by becoming less informative. And these stochastic processes are just a model of the unthinking choices we make whenever we speak a sentence, whenever we send any message at all.
It turns out that some of the most childish questions about the world—“Why don’t apples fall upwards?”—are also the most scientifically productive.
If there is a pantheon of such absurd and revealing questions, it ought to include a space for Shannon’s: “Why doesn...
his interest in the unexamined statistical nature of messages, and his intuition that a mastery of this nature might extend our powers of communication. He
“Roughly, redundancy means that more symbols are transmitted in a message than are actually needed to bear the information.”
“When we write English, half of what we write is determined by the structure of the language and half is chosen freely.”
So the speed with which we can communicate over a given channel depends on how we encode our messages: how we package them, as compactly as possible, for shipment.
Shannon’s first theorem proves that there is a point of maximum compactness for every message source. We have reached the limits of communication when every symbol tells us something new.
In fact, when transmission speed is such a valuable commodity (consider everything you can’t do with a dial-up modem), we have to do better. And if we bear in mind the statistics of this particular language, we can. It’s just a matter of using the fewest bits on the most common letters, and using the most cumbersome strings on the rarest ones.
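One concrete way to realize "the fewest bits on the most common letters" is Huffman coding, a later algorithm (not the book's construction) that builds a prefix code from symbol frequencies. A compact sketch:

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    """Assign short bit strings to frequent symbols and longer ones
    to rare symbols by repeatedly merging the two lightest subtrees."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)  # unique counter so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("aaaabbc")
# The most common letter gets the shortest code, the rarest the longest.
assert len(codes["a"]) <= len(codes["b"]) <= len(codes["c"])
```

Encoding with such a code uses fewer bits overall than a fixed-length alphabet whenever the symbol frequencies are skewed — exactly the trade the excerpt describes.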