Kindle Notes & Highlights
by Jimmy Soni
Read between August 27, 2017 and January 29, 2018
If Google’s “20 percent time”—the practice that frees one-fifth of a Google employee’s schedule to devote to blue-sky projects—seems like a West Coast indulgence, then Bell Labs’ research operation, buoyed by a federally approved monopoly and huge profit margins, would appear gluttonous by comparison.
This choice of questions has always been a matter of intuition as much as erudition, the irreducible kernel of art in science.
As Fred Kaplan explained in his history of wartime science, “It was a war in which the talents of scientists were exploited to an unprecedented, almost extravagant degree.” There were urgent questions that needed answers, and the scientifically literate were uniquely equipped to answer them.
hundreds of the world’s top mathematical minds put their personal research aside, swallowed various degrees of pride, and gathered at the outposts of Los Alamos, Bletchley Park, Tuxedo Park—and Bell Labs.
The secrecy, the intensity, the drudgery, the obligatory teamwork—all of it seems to have gotten to him in a deeply personal way.
That’s the essence of cryptology: a series of substitutions, trades of one letter or word for another letter or word (or language).
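For illustration, here is a minimal Python sketch of the substitution idea the passage describes; the key, seed, and message are invented for the example, and a cipher this simple falls quickly to frequency analysis.

```python
import random
import string

def make_key(seed=42):
    """Build a random one-to-one substitution table over the lowercase alphabet."""
    rng = random.Random(seed)
    shuffled = rng.sample(string.ascii_lowercase, 26)
    return dict(zip(string.ascii_lowercase, shuffled))

def substitute(text, key):
    """Trade each letter for its substitute; leave spaces and punctuation alone."""
    return "".join(key.get(ch, ch) for ch in text.lower())

key = make_key()
inverse = {v: k for k, v in key.items()}  # decryption is the reverse trade

cipher = substitute("attack at dawn", key)
print(cipher)                       # ciphertext depends on the key
print(substitute(cipher, inverse))  # recovers 'attack at dawn'
```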
So secret was SIGSALY that, even as a member of the project team, Shannon was not told what all his number-crunching was for. But the work gave him a window into the world of encoded speech, transmission of information, and cryptography—a synthesis that, at that moment in history, may not have taken place anywhere other than at Bell Labs.
the fiasco of the telegraph helped to crystallize three enduring lessons that would remain at the heart of communications science long after its details were forgotten, and long after the specific problem of transatlantic telegraphy had been tolerably solved.
First, communication is a war against noise.
Second, there are limits to brute force.
Third, what hope there was of doing better lay in investigating the boundaries between the hard world of physics and the invisible world of messages.
Information was old. A science of information was just beginning to stir.
It was Shannon who made the final synthesis, who defined the concept of information and effectively solved the problem of noise.
And it seemed that a greater range of frequencies imposed on top of one another, a greater “bandwidth,” was needed to generate the more interesting and complex waves that could carry richer information. To efficiently carry a phone conversation, the Bell network needed frequencies ranging from about 200 to 3,200 hertz, or a bandwidth of 3,000 hertz. Telegraphy required less; television would require 2,000 times more. Nyquist showed how the bandwidth of any communications channel provided a cap on the amount of “intelligence” that could pass through it at a given speed.
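The figures are easy to check: scaling the 3,000-hertz telephone band by the stated factor of 2,000 gives

$$ 3{,}000 \text{ Hz} \times 2{,}000 = 6{,}000{,}000 \text{ Hz} = 6 \text{ MHz}, $$

which matches the bandwidth of an analog television channel.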
A continuous signal still varied smoothly in amplitude, but you could also represent that signal as a series of samples, or discrete time-slices—and within the limit of a given bandwidth, no one would be able to tell the difference.
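Shannon would later make this sampling idea exact. In modern terms (a standard restatement, not the book's own formula): a signal containing no frequencies above $W$ hertz is completely determined by samples spaced $1/2W$ seconds apart, and can be rebuilt from them by sinc interpolation,

$$ f(t) = \sum_{n=-\infty}^{\infty} f\!\left(\frac{n}{2W}\right)\,\mathrm{sinc}\!\left(2Wt - n\right), \qquad \mathrm{sinc}(x) = \frac{\sin \pi x}{\pi x}, $$

so the 3,200-hertz top of the telephone band is captured by roughly 6,400 samples per second.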
it showed that “the world of technical communications is essentially discrete or ‘digital.’”
Those four paragraphs were, it turned out, a first crack at explaining the relationship between the physical properties of a channel and the speed with which it could transmit intelligence. It was a step beyond …
In other words, Nyquist showed that the speed at which a telegraph could transmit intelligence depended on two factors: the speed at which it could send signals, and the number of “letters” in its vocabulary.
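Nyquist's 1924 relation puts those two factors on one line: a telegraph sending $k$ signal elements per second, each drawn from $m$ possible “letters,” transmits intelligence at the speed

$$ W = k \log m. $$

Doubling the signaling rate doubles $W$; doubling the vocabulary adds only a constant increment, since $\log 2m = \log m + \log 2$.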
At the same time, his way of defining intelligence—“different letters, figures, etc.”—remained distressingly vague. Behind the letters and figures there was—what, exactly?
Shannon spent much of his life working with the conceptual tools that Hartley built, and for the better part of his life, much of his public identity—“Claude Shannon, Father of Information Theory”—was bound up in having been the one who extended Hartley’s ideas far beyond what Hartley, or anyone, could have imagined.
Hartley’s interests in communications networks were more promiscuous than Nyquist’s: he was in search of a single framework that could encompass the information-transmitting power of any medium…
In this, Hartley formalized an intuition already wired into the phone company—which was, after all, in the business of transmission, not interpretation.
The information value of a symbol depends on the number of alternatives that were killed off in its choosing. Symbols from large vocabularies bear more information than symbols from small ones. Information measures freedom of choice.
That explains what the logarithm is doing in Hartley’s formula (and Nyquist’s): it’s converting an exponential change into a linear one. For Hartley, this was a matter of “practical engineering value.”
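In Hartley's notation, a message of $n$ symbols drawn from an alphabet of $s$ symbols has $s^n$ possible forms, an exponential quantity; his measure is its logarithm,

$$ H = \log s^n = n \log s, $$

so information grows in step with message length instead of exploding with it.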
These were powerful questions in their own right. But in all the generations of human communication, those questions were posed with urgency and rigor just then because the answers had suddenly grown exceptionally valuable.
our sudden skill at communicating had outstripped our knowledge of communication itself. And whether in disaster—a fried cable—or merely an inconvenience—the flicker and blur of the first televisions—that ignorance exacted its toll.
In this notion that even a picture can be quantified, there’s an insight into information’s radically utilitarian premises, its almost Faustian exchange.
A measure for information, for example, helps us uncover the connections between the bandwidth of a medium, and the information in the message, and the time devoted to sending it.
Treating information, bandwidth, and time as three precise, swappable quantities could show which ideas for sending messages were “within the realm of physical possibility”—and which shouldn’t even be attempted.
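Hartley's version of that trade can be stated compactly: the information a channel can carry is proportional to the product of its bandwidth $B$ and the time $t$ spent transmitting,

$$ H \propto B\,t, $$

so a narrower channel can be paid for with a longer transmission, but nothing yields information for free.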
This, then, was roughly where information sat when Claude Shannon picked up the thread. What began in the nineteenth century as an awareness that we might speak to one another more accurately at a distance if we could somehow quantify our messages had—almost—ripened into a new science. Each step was a step into higher abstraction. Information was the electric flow through a wire. Information was a number of characters sent by a telegraph. Information was choice among symbols. At each iteration, the concrete was falling away.
So from Hartley to Shannon, said Bell Labs’ John Pierce, the science of information “appears to have taken a prolonged and comfortable rest.” Blame Hartley’s relativity fixation, perhaps. Or blame the war—a war that unleashed tremendous applications in plane-tracking robot bombs and digital telephony, in code making and codebreaking and computing, but a war that saw few scientists with the time or incentive to step back and ask what had been learned about communication in general. Or simply blame the fact that the next and decisive step after Hartley could only be found with genius and time.
Shannon was arguably the first to conceive of our genes as information bearers, an imaginative leap that erased the border between mechanical, electronic, and biological messages.
What does information really measure? It measures the uncertainty we overcome. It measures our chances of learning something we haven’t yet learned. Or, more specifically: when one thing carries information about another—just as a meter reading tells us about a physical quantity, or a book tells us about a life—the amount of information it carries reflects the reduction in uncertainty about the object.
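Shannon's measure makes that reduction precise. For a source whose outcomes occur with probabilities $p_i$, the information per outcome is

$$ H = -\sum_i p_i \log_2 p_i \ \text{bits}. $$

A fair coin carries $H = 1$ bit per toss; a coin that lands heads 90 percent of the time carries only about $0.47$ bits, because its tosses are already largely predictable.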
nearly all the messages that interest engineers do come with implicit rules, are something less than free, and Shannon taught engineers how to take huge advantage of this fact.
If there is a pantheon of such absurd and revealing questions, it ought to include a space for Shannon’s: “Why doesn’t anyone say XFOML RXKHRJFFJUJ?”
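A short Python sketch of the contrast behind that question; the sample text and seed here are arbitrary stand-ins for the published frequency tables Shannon actually drew on. Uniformly random letters produce XFOML-like strings, while letters weighted by their frequencies already drift toward the look of language.

```python
import random
import string
from collections import Counter

rng = random.Random(0)
alphabet = string.ascii_uppercase + " "

# Zero-order approximation: every symbol equally likely (XFOML territory).
zero_order = "".join(rng.choice(alphabet) for _ in range(40))

# First-order approximation: symbols weighted by their frequency in a sample.
sample = "THE ESSENCE OF COMMUNICATION IS TO MAKE ONESELF PREDICTABLE " * 5
counts = Counter(ch for ch in sample if ch in alphabet)
symbols, weights = zip(*counts.items())
first_order = "".join(rng.choices(symbols, weights=weights, k=40))

print("zero order: ", zero_order)
print("first order:", first_order)
```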
But in the end, codebreaking remained possible, and remains so, because every message runs up against a basic reality of human communication. It always involves redundancy; to communicate is to make oneself predictable.
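That predictability can be put on rough numerical footing. The sketch below (with an arbitrary, far-too-short sample) compares the single-character entropy of English-like text against the maximum for a 27-symbol alphabet; the gap between them is the redundancy a codebreaker exploits.

```python
import math
from collections import Counter

def entropy_per_char(text):
    """Empirical single-character entropy, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = "to communicate is to make oneself predictable"
h_text = entropy_per_char(sample)
h_max = math.log2(27)  # 26 letters plus space, all equally likely

print(f"sample text: {h_text:.2f} bits/char")
print(f"uniform max: {h_max:.2f} bits/char")
print(f"redundancy:  {1 - h_text / h_max:.0%}")
```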
It seems that an evil Nazi scientist, Dr. Hagen Krankheit, had escaped Germany with a prototype of his Müllabfuhrwortmaschine, a fearsome weapon of war “anticipated in the work . . . of Dr. Claude Shannon.” Krankheit’s machine used the principles of randomized text to totally automate the propaganda industry. By randomly stitching together agitprop phrases in a way that approximated human language, the Müllabfuhrwortmaschine could produce an endless flood of demoralizing statements. On one trial run, it spat out “Subversive elements were revealed to be related by marriage to a well-known …”
Weaver, perhaps through an excess of enthusiasm, foresaw a world in which information theory could help computers fight the Cold War and enable instantaneous rendering of Soviet documents into English. Inspired, he praised Shannon’s work with exuberance to the head of the Rockefeller Foundation.
Initially bankrolled entirely by Alfred Lee Loomis, the intensely private millionaire financier, attorney, and self-taught physicist, the lab created most of the radar systems used to identify German U-boats—and its network of scientists and technicians became much of the nucleus of the Manhattan Project.
“Radar won the war; the atom bomb ended it.” This was the world of fighting man’s physics.
“There is a tale told of him that when one of his students asked of what use was the study of geometry, Euclid asked his slave to give the student threepence, ‘since he must make gain of what he learns.’”
The problem, in hindsight, might not have been Doob’s misunderstanding of the mathematics at work; rather, he didn’t understand that Shannon’s math was a means to an end.
accomplished as Doob was, the gaps in Shannon’s paper which seemed large to Doob seemed like small and obvious steps to Shannon. Doob might not realize this, for how often, if ever, would he have encountered a mind like Shannon’s?”
In a sense, leaving the dots for others to connect was a calculated gamble on Shannon’s part: had he gone through that painstaking work himself, the paper would have been much longer and appeared much later, both factors that would have likely diminished its reception.
It didn’t help that, in appearance, Wiener was easy to ridicule. Bearded, bespectacled, nearsighted, with red-veined skin and a ducklike walk, there was hardly a stereotype of the addle-pated academic that Wiener did not fulfill.
As one story had it, Wiener arrived at what he thought was his home and fumbled with his keys, finding that they would not fit in the lock. He turned to the children playing in the street and asked, “Can you show me where the Wieners live?” A little girl replied, “Follow me, Daddy. Mommy sent me here to point the way to our new house.”
the manuscript that formed the outlines of Wiener’s contributions to information theory was nearly lost to humanity. Wiener had entrusted the manuscript to Walter Pitts, a graduate student, who had checked it as baggage for a trip from New York’s Grand Central Terminal to Boston. Pitts forgot to retrieve the baggage. Realizing his mistake, he asked two friends to pick up the bag. They either ignored or forgot the request. Only five months later was the manuscript finally tracked down; it had been labeled “unclaimed property” and cast aside in a coatroom.
In April 1947, Wiener and Shannon shared the same stage, and both had the opportunity to present early versions of their thoughts. Wiener, in a moment of excessive self-regard, would write to a colleague, “The Bell people are fully accepting my thesis concerning statistics and communications engineering.”
“Wiener, in a sense, did a lot to push the idea of cybernetics, which is a somewhat vague idea, and got a lot of worldwide publicity for it,” said Stanford’s Thomas Kailath. “But that wasn’t Shannon’s personality at all. Wiener loved the publicity, and Shannon could not have cared less.”
But it still stands as an important moment in Shannon’s story. Shannon gave the impression of the carefree scholar—someone secure enough in his own intellect and reputation to brush aside the opinion of others. Wiener’s opinions and contribution mattered—but not because Shannon worried about who would or wouldn’t receive credit. Debates in his field mattered to him less for their opportunities to assert “ownership” of information theory than for their bearing on the substance of information theory itself. Credit, in the end, counted less than accuracy.