A Mind at Play: How Claude Shannon Invented the Information Age
Read between November 17 and November 25, 2017
1%
Geniuses are the luckiest of mortals because what they must do is the same as what they most want to do and, even if their genius is unrecognized in their lifetime, the essential earthly reward is always theirs, the certainty that their work is good and will stand the test of time. One suspects that the geniuses will be least in the Kingdom of Heaven—if, indeed, they ever make it; they have had their reward. —W. H. AUDEN
9%
Logic, just like a machine, was a tool for democratizing force: built with enough precision and skill, it could multiply the power of the gifted and the average alike.
9%
It was one thing to compare logic to a machine—it was another entirely to show that machines could do logic.
11%
Less than a decade after Shannon’s paper, the great analog machine, the differential analyzer, was effectively obsolete, replaced by digital computers that could do its work literally a thousand times faster, answering questions in real time, driven by thousands of logic gates that each acted as “an all-or-none device.” The medium now was vacuum tubes, not switches—but the design was a direct descendant of Shannon’s discovery.
12%
In that light, a story from Hapgood, the MIT historian: “Years ago an engineer told me a fantasy he thought threw some light on the ends of engineering, or at least on those underlying his own labors. A flying saucer arrives on Earth and the crew starts flying over cities and dams and canals and highways and grids of power lines; they follow cars on the roads and monitor the emissions of TV towers. They beam up a computer into their saucer, tear it down, and examine it. ‘Wow,’ one of them finally exclaims. ‘Isn’t nature incredible!?’ ”
13%
More to the point, it was a matter of deep conviction for Bush that specialization was the death of genius. “In these days, when there is a tendency to specialize so closely, it is well for us to be reminded that the possibilities of being at once broad and deep did not pass with Leonardo da Vinci or even Benjamin Franklin,” Bush said in a speech at MIT. “Men of our profession—we teachers—are bound to be impressed with the tendency of youths of strikingly capable minds to become interested in one small corner of science and uninterested in the rest of the world. . . . It is unfortunate when a ...more
13%
Few scientists had compiled better data on heredity and inheritance than eugenicists—in some ways, eugenics is to modern genetics as alchemy is to chemistry, the disreputable relative in the attic.
14%
Shannon offered genuinely new eyes on old problems, and where his thought was original, it was almost unconsciously so. Like something of a genetic Joseph Conrad, he could reach heights of creativity in an adopted language because he had missed learning its clichés in his youth.
17%
Figure out, a Bell researcher might be told, how “fundamental questions of physics or chemistry might someday affect communications.” Might someday—Bell researchers were encouraged to think decades down the road, to imagine how technology could radically alter the character of everyday life, to wonder how Bell might “connect all of us, and all of our new machines, together.” One Bell employee of a later era summarized it like this: “When I first came there was the philosophy: look, what you’re doing might not be important for ten years or twenty years, but that’s fine, we’ll be there then.”
17%
Claude Shannon was one of those who thrived. Among the institutions that had dotted the landscape of Shannon’s life, it’s hard to imagine a place better suited to his mix of passions and particular working style than the Bell Laboratories of the 1940s. “I had freedom to do anything I wanted from almost the day I started,” he reflected. “They never told me what to work on.”
18%
It’s taken as a given in our era that a high-level math mind—a “quant”—can find gainful employment. But that wasn’t always the case, and especially not in the world of elite mathematics in the early twentieth century. What was valued in the highest levels of mathematics had precious little application outside of it. Solutions to abstract problems won glory, and thus whole careers were devoted to chasing solutions to problems like the Riemann hypothesis, the Poincaré and Collatz conjectures, and Fermat’s last theorem. These were the math world’s greatest puzzles, and the fact that decades had ...more
18%
“The typical mathematician,” Fry observed, is not the sort of man to carry on an industrial project. He is a dreamer, not much interested in things or the dollars they can be sold for. He is a perfectionist, unwilling to compromise; idealizes to the point of impracticality; is so concerned with the broad horizon that he cannot keep his eye on the ball.
20%
In truth, the Institute for Advanced Study proved unhealthy for Shannon. For some, it was a land of academic lotus-eating, an island where the absence of the ordinary worries of the job—students, deadlines, publication pressure—proved enervating rather than invigorating. The physicist Richard Feynman, who was working on his doctorate at Princeton while Shannon was at the IAS down the street, observed the inertia firsthand: “A kind of guilt or depression worms inside of you, and you begin to worry about not getting any ideas. . . . You’re not in contact with the experimental guys. You don’t ...more
21%
Below that ceiling, however, Weaver was a heterodox thinker whose passions ran the gamut; he published or worked on problems in engineering, mathematics, machine learning, translation, biology, the natural sciences, and probabilities. But unlike many of his colleagues, he believed in a world outside the confines of science and math, and he avoided the all-too-common insularity of the fields he worked in and the thinkers who worked in them. “Do not overestimate science, do not think that science is all that there is,” he urged students in a 1966 talk.
21%
Fire control was, essentially, the study of hitting moving targets. The targets were anything and everything the enemy could hurl through the air to cause damage—planes, rockets, ballistics. Imagine a gun firing a single shot at a target. Now imagine that the gun is the size of a two-story house, that it is placed on a moving Navy ship in the middle of the ocean, and that it is trying to shoot down an enemy fighter moving at 350 miles per hour. That’s a rough description of the challenge of fire control, and it was put to the mathematics group at Bell Labs, among others, to design the machines ...more
22%
David Mindell, a historian of technology, put it like this: The wartime efforts of Bell Labs in fire control contributed to a new vision of technology, a vision that treated different types of machinery (radar, amplifiers, electric motors, computers) in analytically similar terms—paving the way for information theory, systems engineering, and classical control theory. These efforts produced not only new weapons but also a vision of signals and systems. Through ideas and through people, this vision diffused into engineering culture and solidified as the technical and conceptual foundations of ...more
22%
This has not been a scientist’s war; it has been a war in which all have had a part. The scientists, burying their old professional competition in the demand of a common cause, have shared greatly and learned much. —Vannevar Bush
22%
As Fred Kaplan explained in his history of wartime science, “It was a war in which the talents of scientists were exploited to an unprecedented, almost extravagant degree.” There were urgent questions that needed answers, and the scientifically literate were uniquely equipped to answer them. Kaplan cataloged just a few: How many tons of explosive force must a bomb release to create a certain amount of damage to certain types of targets? In what sorts of formation should bombers fly? Should an airplane be heavily armored or should it be stripped of defenses so it can fly faster? At what depths ...more
22%
One of the most insightful surveys of wartime mathematics comes from J. Barkley Rosser, a University of Wisconsin professor who interviewed some 200 mathematicians who, like him, had been pressed into national service. Rosser concluded that mathematicians acted as a kind of accelerant, helpful in speeding up research and development that would otherwise have been painfully manual and slow. The attitude of many with the problems they were asked to solve was that the given problem was not really mathematics but, since an answer was needed, urgently and quickly, they got on with it. . . . Without ...more
23%
There was something else, too: as Rosser suggested, the math problems brought forth by the war were hardly math at all—or, at least, they were beneath anyone considered worthy of working on them. The defense establishment had, in a sense, overinvested in brainpower. In Rosser’s words, one of his colleagues insisted to his dying day . . . that he never did an iota of mathematics during the War. True enough, the problems were mostly very pedestrian stuff, as mathematics. I was never required to appeal to the Gödel incompleteness theorem, or use the ergodic theorem, or any other key results in that ...more
23%
The war for signals intelligence was as much about code-making as it was about codebreaking, as illustrated by one famous and tragic story. On the morning of December 7, 1941, George Marshall, the Army’s chief of staff, had an important message to send to his Pacific Command: Japan had decided that it could no longer mediate its differences with the United States through politics; war was likely. But how to transmit this information? The only system available to the nation’s top military and political leaders had long been considered insecure. The message was sent instead by the comparatively ...more
24%
As in other areas of Shannon’s life, his most important work in cryptography yielded a rigorous, theoretical underpinning for many of a field’s key concepts. Shannon’s exposure to day-to-day cryptographic work during the war, it seems, was important—but its primary purpose was as grist for a paper that would only be published in classified form on September 1, 1945—one day before the Japanese surrender was signed. This paper, “A Mathematical Theory of Cryptography—Case 20878,” contained important antecedents of Shannon’s later work—but it also provided the first-ever proof of a critical ...more
26%
The ancient art of mathematics . . . does not reward speed so much as patience, cunning and, perhaps most surprising of all, the sort of gift for collaboration and improvisation that characterizes the best jazz musicians. —Gareth Cook
27%
It turns out that there were three certified geniuses at BTL [Bell Telephone Laboratories] at the same time, Claude Shannon of information theory fame, John Pierce, of communication satellite and traveling wave amplifier fame, and Barney. Apparently the three of those people were intellectually INSUFFERABLE. They were so bright and capable, and they cut an intellectual swath through that engineering community, that only a prestige lab like that could handle all three at once.
27%
Shannon’s response to colleagues who could not keep pace was simply to forget about them. “He never argued his ideas. If people didn’t believe in them, he ignored those people,” McMillan told Gertner.
28%
First, communication is a war against noise. Noise is interference between telephone wires, or static that interrupts a radio transmission, or a telegraph signal corrupted by failing insulation and decaying on its way across an ocean. It is the randomness that creeps into our conversations, accidentally or deliberately, and blocks our understanding. Across short distances, or over relatively uncomplicated media—Bell calling Watson from the next room, or a landline telegraph from London to Manchester—noise could be coped with. But as distances increased and the means of sending and storing ...more
30%
There’s only meaning where there’s prior agreement about our symbols. And all communication is like this, from waves sent over electrical wires, to the letters agreed upon to symbolize words, to the words agreed upon to symbolize things.
30%
The real measure of information is not in the symbols we send—it’s in the symbols we could have sent, but did not.
30%
To send a message is to make a selection from a pool of possible symbols, and “at each selection there are eliminated all of the other symbols which might have been chosen.” To choose is to kill off alternatives. We see this most clearly, Hartley observed, in the cases in which messages happen to bear meaning. “For example, in the sentence, ‘Apples are red,’ the first word eliminated other kinds of fruit and all other objects in general. The second directs attention to some property or condition of apples, and the third eliminates other possible colors.” This rolling process of elimination ...more
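Hartley's counting idea can be put in a few lines of Python (a toy illustration with names of my own choosing, not code from the book): with s symbols available and n selections made, there are s^n possible messages, and taking a logarithm lets the information from successive selections add up.

```python
import math

def hartley_information(alphabet_size: int, message_length: int) -> float:
    """Hartley's measure in bits: the log of the number of messages
    that *could* have been sent (alphabet_size ** message_length)."""
    return message_length * math.log2(alphabet_size)

# One coin flip selects between 2 alternatives: exactly 1 bit.
one_flip = hartley_information(2, 1)

# A 3-word sentence drawn from a 1,000-word vocabulary: each word
# chosen eliminates 999 alternatives, and the logs of the three
# choices simply add.
sentence_bits = hartley_information(1000, 3)
```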
32%
It’s entirely fair to design a new measurement with human needs in mind, as long as the measurement is internally consistent. By comparison, there’s no natural reason why a single degree Celsius should cover a wider range of temperature than a single degree Fahrenheit—it’s just that many people find it convenient to think of water’s freezing point as 0° and its boiling point as 100° and define the degrees in between accordingly. Choosing whether to think of information as a quantity that increases exponentially or linearly with message length is a matter of human convenience in the same way, ...more
32%
Every system of communication—not just the ones existing in 1948, not just the ones made by human hands, but every system conceivable—could be reduced to a radically simple essence. • The information source produces a message. • The transmitter encodes the message into a form capable of being sent as a signal. • The channel is the medium through which the signal passes. • The noise source represents the distortions and corruptions that afflict the signal on its way to the receiver. • The receiver decodes the message, reversing the action of the transmitter. • The destination is the recipient ...more
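The six-part schematic above can be mimicked as a toy pipeline (a sketch under my own naming conventions, not Shannon's notation): a transmitter that encodes characters into bits, a channel whose noise source may flip bits, and a receiver that reverses the transmitter.

```python
import random

def transmitter(message: str) -> list:
    """Encode the source's message: each character becomes 8 bits."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(signal: list, flip_prob: float, rng: random.Random) -> list:
    """The noise source flips each bit with probability flip_prob."""
    return [bit ^ (1 if rng.random() < flip_prob else 0) for bit in signal]

def receiver(signal: list) -> str:
    """Decode the signal, reversing the action of the transmitter."""
    return "".join(chr(int("".join(map(str, signal[i:i + 8])), 2))
                   for i in range(0, len(signal), 8))

rng = random.Random(0)
message = "information"
# With a noiseless channel the destination gets a perfect copy.
received = receiver(channel(transmitter(message), 0.0, rng))
```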
33%
What does information really measure? It measures the uncertainty we overcome. It measures our chances of learning something we haven’t yet learned. Or, more specifically: when one thing carries information about another—just as a meter reading tells us about a physical quantity, or a book tells us about a life—the amount of information it carries reflects the reduction in uncertainty about the object.
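That reduction in uncertainty is what Shannon's entropy formula, H = -Σ p·log2(p), quantifies; a minimal sketch (my own toy code, not from the book):

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy in bits: the average uncertainty a message
    drawn from this distribution resolves."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = entropy_bits([0.5, 0.5])   # maximal uncertainty for 2 outcomes
loaded_die = entropy_bits([0.5, 0.1, 0.1, 0.1, 0.1, 0.1])  # more predictable
certainty = entropy_bits([1.0])        # a foregone conclusion teaches nothing
```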
33%
Information is stochastic. It is neither fully unpredictable nor fully determined.
33%
That’s why the classic model of a stochastic process is a drunk man stumbling down the street. He doesn’t walk in the respectably straight line that would allow us to predict his course perfectly. Each lurch looks like a crapshoot. But watch him for long enough and we’ll see patterns start to emerge from his stumble, patterns that we could work out statistically if we cared to.
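The drunkard's walk is easy to simulate (a toy sketch; the drift parameter and seed are my own choices): each lurch is random, yet over many steps a statistical pattern emerges.

```python
import random

def drunkard_walk(steps: int, p_east: float = 0.7, seed: int = 0) -> list:
    """One lurch at a time looks like a crapshoot, but a statistical
    drift (p_east vs. 1 - p_east) emerges over a long enough watch."""
    rng = random.Random(seed)
    position, path = 0, []
    for _ in range(steps):
        position += 1 if rng.random() < p_east else -1
        path.append(position)
    return path

path = drunkard_walk(10_000)
drift = path[-1] / len(path)   # converges toward 2 * 0.7 - 1 = 0.4
```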
34%
The codes that Shannon tested for Roosevelt and that Turing cracked for Churchill were more convoluted still. But in the end, codebreaking remained possible, and remains so, because every message runs up against a basic reality of human communication. It always involves redundancy; to communicate is to make oneself predictable.
35%
Every human language is highly redundant. From the dispassionate perspective of the information theorist, the majority of what we say—whether out of convention, or grammar, or habit—could just as well go unsaid. In his theory of communication, Shannon guessed that the world’s wealth of English text could be cut in half with no loss of information: “When we write English, half of what we write is determined by the structure of the language and half is chosen freely.” Later on, his estimate of redundancy rose as high as 80 percent: only one in five characters actually bears information.
36%
If we could not compress our messages, a single audio file would take hours to download, streaming Web video would be impossibly slow, and hours of television would demand a bookshelf of tapes, not a small box of discs. Because we can compress our messages, video files can be compacted to just a twentieth of their size. All of this communication—faster, cheaper, more voluminous—rests on Shannon’s realization of our predictability. All of that predictability is fat to be cut; since Shannon, our signals have traveled light.
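The redundancy-as-fat idea can be seen with any off-the-shelf compressor, such as zlib from Python's standard library (the sample data below is my own, purely illustrative): predictable text shrinks dramatically, while random bytes, which carry close to the maximum information per symbol, barely shrink at all.

```python
import os
import zlib

# Highly redundant English-like text vs. incompressible random bytes.
english = b"the quick brown fox jumps over the lazy dog " * 50
noise = os.urandom(len(english))

english_ratio = len(zlib.compress(english)) / len(english)
noise_ratio = len(zlib.compress(noise)) / len(noise)
# english_ratio is tiny; noise_ratio stays near (or even above) 1.0.
```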
36%
was something radically new. For engineering professor James Massey, it was this promise above all that made Shannon’s theory “Copernican”: Copernican in the sense that it productively stood the obvious on its head and revolutionized our understanding of the world. Just as the sun “obviously” orbited the earth, the best answer to noise “obviously” had to do with physical channels of communication, with their power and signal strength. Shannon proposed an unsettling inversion. Ignore the physical channel and accept its limits: we can overcome noise by manipulating our messages. The answer to ...more
37%
As long as we respect the speed limit of the channel, there is no limit to our accuracy, no limit to the amount of noise through which we can make ourselves heard. Yes, overcoming more errors, or representing more characters, would demand more complex codes. So would combining the advantages of codes that compress and codes that guard against error: that is, reducing a message to bits as efficiently as possible, and then adding the redundancy that protects its accuracy. Coding and decoding would still exact their cost in effort and time. But Shannon’s proof stood: there is always an answer. ...more
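The simplest code that buys accuracy with redundancy is triple repetition; a minimal sketch (my own toy example, far cruder than the codes Shannon's proof promises exist): send every bit three times, and a majority vote at the receiver outvotes any single flip the noise introduces.

```python
def encode_repeat3(bits: list) -> list:
    """Add redundancy: transmit every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_repeat3(coded: list) -> list:
    """Majority vote per triple: any single flipped bit is outvoted."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1, 0]
sent = encode_repeat3(message)
sent[7] ^= 1                         # the channel's noise flips one bit
recovered = decode_repeat3(sent)     # the vote corrects the error
```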
37%
Shannon approached the great man with his idea of information-as-resolved-uncertainty—which would come to stand at the heart of his work—and with an unassuming question. What should he call this thing? Von Neumann answered at once: say that information reduces “entropy.” For one, it was a good, solid physics word. “And more importantly,” he went on, “no one knows what entropy really is, so in a debate you will always have the advantage.” Almost certainly, this conversation never happened. But great science tends to generate its own lore, and the story is almost coeval with Shannon’s paper.
38%
In an unpublished spoof written a year later, Shannon imagined the damage his methods would do if they fell into the wrong hands. It seems that an evil Nazi scientist, Dr. Hagen Krankheit, had escaped Germany with a prototype of his Müllabfuhrwortmaschine, a fearsome weapon of war “anticipated in the work . . . of Dr. Claude Shannon.” Krankheit’s machine used the principles of randomized text to totally automate the propaganda industry. By randomly stitching together agitprop phrases in a way that approximated human language, the Müllabfuhrwortmaschine could produce an endless flood of ...more
38%
The link between information and entropy was made explicit in Shannon’s paper. But a connection between information and physics was first suggested, as early as 1929, by the Hungarian physicist Leo Szilard. Briefly, Szilard resolved an old puzzle in the physics of heat: the Second Law of Thermodynamics says that entropy is constantly increasing, but what if we imagined a microscopic and intelligent being, which James Clerk Maxwell had dubbed a “demon,” that tried to decrease entropy by sorting hot molecules from cold? Would that contradict the Second Law? Szilard showed that it would not: the ...more
38%
The two had met in the fall of 1948 and discussed the theory. Weaver, perhaps through an excess of enthusiasm, foresaw a world in which information theory could help computers fight the Cold War and enable instantaneous rendering of Soviet documents into English. Inspired, he praised Shannon’s work with exuberance to the head of the Rockefeller Foundation, Chester Barnard. In early 1949, Weaver sent Barnard his own layman’s translation of “A Mathematical Theory of Communication.”
39%
As Lee DuBridge, the lab’s director, would later quip, “Radar won the war; the atom bomb ended it.” This was the world of fighting man’s physics.
39%
The publication of The Mathematical Theory of Communication stands as one of the defining moments in the history of information theory, and not only on account of its commercial success. Even the title sent an important message: in the span of a year, Shannon’s original “A Mathematical Theory of Communication” had become the definitive “The Mathematical Theory of Communication.” As electrical engineer and information theorist Robert Gallager pointed out, the subtle change in the article’s context, from one of several articles in a technical journal to centerpiece of a book, was a mark of ...more
39%
Doob was a fierce critic of anything he regarded as flabby thinking. Doob was open about the fact that he was, perhaps too frequently, looking for trouble. Asked why he became interested in mathematics in the first place, he answered: I have always wanted to understand what I was doing, and why I was doing it, and I have often been a pest because I have objected when what I heard or read was not to be taken literally. The boy who noticed that the emperor wasn’t dressed and objected loudly has always been my model. Mathematics seemed to match my psychology, a mistake reflecting the fact that ...more
39%
Above all, Doob professed loyalty to the “austere and often abstruse” world of pure mathematics. If applied mathematics concerns itself with concrete questions, pure mathematics exists for its own sake. Its cardinal questions are not “How do we encrypt a telephone conversation?” but rather “Are there infinitely many twin primes?” or “Does every true mathematical statement have a proof?” The divorce between the two schools has ancient origins. Historian Carl Boyer traces it to Plato, who regarded mere computation as suitable for a merchant or a general, who “must learn the art of numbers or he ...more
41%
Hans Freudenthal remembered, In appearance and behaviour, Norbert Wiener was a baroque figure, short, rotund, and myopic, combining these and many qualities in extreme degree. His conversation was a curious mixture of pomposity and wantonness. He was a poor listener. . . . He spoke many languages but was not easy to understand in any of them. He was a famously bad lecturer.
41%
“Wiener, in a sense, did a lot to push the idea of cybernetics, which is a somewhat vague idea, and got a lot of worldwide publicity for it,” said Stanford’s Thomas Kailath. “But that wasn’t Shannon’s personality at all. Wiener loved the publicity, and Shannon could not have cared less.” The popular success of Cybernetics launched a debate over priority within the small clique of mathematicians who wanted to know whether Wiener or Shannon could rightly claim credit for information theory. It also gave rise to a dispute over whether or not Wiener—whose chapter on information as a statistical ...more
42%
Nearly all who knew them testified to how good a match Betty was for Claude Shannon—in every sense. It wasn’t just the joy he found in her company, though he did. Betty and Claude became professional partners, as well. Albert Einstein famously said of his wife, Mileva Maric, “I need my wife. She solves all the mathematical problems for me.” Claude’s work was very much his own, but there’s no denying Betty’s help in bringing it to fruition; she became one of his closest advisers on mathematical matters. She looked up references, took down his thoughts, and, importantly, edited his written work.