Kindle Notes & Highlights
Turing's Cathedral: The Origins of the Digital Universe, by George Dyson
Read between May 23 and May 31, 2020
I am thinking about something much more important than bombs. I am thinking about computers.
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be ...
In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
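Turing's result compresses into a few lines. The sketch below is the standard diagonal argument, not anything from the book: suppose a decider `halts(program, argument)` existed, then build a program that does the opposite of whatever the decider predicts about it. The function names are hypothetical.

```python
# Sketch of Turing's diagonal argument. The point is that the
# hypothetical decider below can never be implemented.

def halts(program, argument):
    """Hypothetical oracle: returns True iff program(argument) halts."""
    raise NotImplementedError  # Turing proved no such function exists

def contrary(program):
    # Do the opposite of whatever the oracle predicts about
    # running `program` on its own source.
    if halts(program, program):
        while True:        # oracle said "halts" -> loop forever
            pass
    return "halted"        # oracle said "loops" -> halt immediately

# Ask the oracle about contrary(contrary):
#   if halts(contrary, contrary) is True,  contrary(contrary) loops  -> wrong
#   if halts(contrary, contrary) is False, contrary(contrary) halts  -> wrong
# Either answer is contradicted, so no general `halts` can exist, and
# there is no systematic way to tell, by looking at a code, what it will do.
```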
If it’s that easy to create living organisms, why don’t you create a few yourself? —Nils Aall Barricelli, 1953
A digital universe—whether 5 kilobytes or the entire Internet—consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information—structure and sequence—according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.
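One way to make the structure/sequence distinction concrete (my illustration, not Dyson's) is a clocked one-bit register: the stored bit is structure, the input stream is sequence, and the clock tick is the gate where one becomes the other.

```python
# Illustrative sketch: a one-bit clocked register.
# The stored bit varies in space but not in time between ticks (memory);
# the input stream varies in time (code); the tick is the "gate" where
# a bit crosses from sequence into structure.

class OneBitRegister:
    def __init__(self):
        self.state = 0          # structure: a difference in space

    def tick(self, input_bit):  # the gate: the transition between instants
        old, self.state = self.state, input_bit
        return old              # the displaced bit re-enters the sequence

reg = OneBitRegister()
sequence_in = [1, 0, 1, 1]      # code: differences in time
sequence_out = [reg.tick(b) for b in sequence_in]
print(sequence_out)             # [0, 1, 0, 1] -- each input bit reappears
                                # one tick later, after a stay as memory
```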
The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. “Any difference that makes a difference” is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.
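Shannon's unit can be stated exactly. In the notation of the 1948 paper, a source with outcome probabilities p_i carries an average information of H bits, and a single distinction between two equally likely alternatives comes out to exactly one bit:

```latex
H = -\sum_i p_i \log_2 p_i , \qquad
H\!\left(\tfrac{1}{2},\tfrac{1}{2}\right)
  = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2}
  = 1 \text{ bit}.
```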
That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. “The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects … capable of a twofold difference onely,” he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.
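Bacon's arithmetic checks out: five two-valued places give 2^5 = 32 differences, more than enough for an alphabet. A minimal sketch of his biliteral cipher follows, using the modern 26-letter alphabet rather than Bacon's 24, with his own two symbols 'a' and 'b':

```python
# A minimal sketch of Bacon's biliteral cipher: each letter becomes a
# five-place pattern of two symbols, since 2**5 = 32 covers the alphabet.

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode(text):
    # Write each letter's index in binary across five places,
    # with 'a' for 0 and 'b' for 1 (Bacon's two symbols).
    groups = []
    for ch in text.upper():
        if ch in ALPHABET:
            bits = format(ALPHABET.index(ch), "05b")
            groups.append(bits.replace("0", "a").replace("1", "b"))
    return " ".join(groups)

def decode(cipher):
    letters = []
    for group in cipher.split():
        index = int(group.replace("a", "0").replace("b", "1"), 2)
        letters.append(ALPHABET[index])
    return "".join(letters)

print(encode("BACON"))           # aaaab aaaaa aaaba abbba abbab
print(decode(encode("BACON")))   # BACON
```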
That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. “By Ratiocination, I mean computation,” Hobbes had announced. “Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing … that all Ratiocination is ...
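Leibniz's point, that the same two symbols carry both logic and arithmetic, can be illustrated in a few lines (my sketch, not anything from the text): the logical connectives are functions on {0, 1}, and ordinary addition reduces to operations on binary digits.

```python
# Sketch: zero and one suffice for logic as well as arithmetic.

# Logical connectives as arithmetic on {0, 1}:
AND = lambda a, b: a * b            # conjunction is multiplication
OR  = lambda a, b: a + b - a * b    # disjunction
NOT = lambda a: 1 - a               # negation is subtraction from one

def add_binary(x, y):
    """Add two non-negative integers using only bitwise logic."""
    while y:
        carry = x & y               # positions that generate a carry
        x = x ^ y                   # sum without carries (XOR)
        y = carry << 1              # carries move one place left
    return x

print(AND(1, 1), OR(0, 1), NOT(0))  # 1 1 1
print(add_binary(0b101, 0b011))     # 8, i.e. 5 + 3 by ratiocination alone
```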
In March of 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth.
In 1936, logician Alan Turing had formalized the powers (and limitations) of digital computers by giving a precise description of a class of devices (including an obedient human being) that could read, write, remember, and erase marks on an unbounded supply of tape.
These “Turing machines” were able to translate, in both directions, between bits embodied as structure (in space) and bits encoded as sequences (in time).
Turing then demonstrated the existence of a Universal Computing Machine that, given sufficient time, sufficient tape, and a precise description, could emulate ...
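Dyson's description maps directly onto code. The sketch below is an illustrative implementation with made-up names, not anything from the book: a finite transition table that reads, writes, erases, and moves over an unbounded tape. A universal machine is then just such a table whose tape holds the description of another machine.

```python
# A minimal Turing machine: read, write, erase, and move on an unbounded tape.

def run(program, tape_input, start="q0", halt="halt", blank=" "):
    tape = dict(enumerate(tape_input))   # sparse tape, unbounded both ways
    state, head = start, 0
    while state != halt:
        symbol = tape.get(head, blank)                  # read the mark
        symbol_out, move, state = program[(state, symbol)]
        tape[head] = symbol_out                         # write (or erase) it
        head += 1 if move == "R" else -1                # step the head
    cells = sorted(tape)
    return "".join(tape.get(i, blank) for i in range(cells[0], cells[-1] + 1))

# Example table: a one-state machine that flips every bit, then halts.
flip = {
    ("q0", "0"): ("1", "R", "q0"),
    ("q0", "1"): ("0", "R", "q0"),
    ("q0", " "): (" ", "R", "halt"),
}
print(run(flip, "10110").strip())   # 01001
```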
Random-access memory gave the world of machines access to the powers of numbers—and gave the world of numbers access to the powers of machines.
The year 1953 was the first one in which more than $1 million was spent on guided missile development by the United States. “Guided” did not imply the precision we take for granted now. “Once it was launched, all that we would know is what city it was going to hit,” von Neumann answered the vice president in 1955.
Three technological revolutions dawned in 1953: thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA.
On April 2, James Watson and Francis Crick submitted “A Structure for Deoxyribose Nucleic Acid” to Nature, noting that the double helical structure “suggests a possible copying mechanism for the genetic material.”
Biological organisms had learned to survive in a noisy, analog environment by repeating themselves, once a generation, through a digital, error-correcting phase, the same way repeater stations are used to convey intelligible messages over submarine cables where noise is being introduced.
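The repeater analogy can be simulated directly. In the sketch below (my illustration, with assumed parameters), an analog copy accumulates noise with every generation, while a digital copy thresholds the signal back to 0 or 1 each time, so small errors never compound.

```python
# Sketch: copying through a digital, error-correcting phase versus
# copying an analog signal directly, generation after generation.
import random

random.seed(0)
message = [0, 1, 1, 0, 1, 0, 0, 1]
NOISE = 0.2   # assumed per-copy noise amplitude, well under the 0.5 threshold

def noisy_copy(signal):
    return [s + random.uniform(-NOISE, NOISE) for s in signal]

def regenerate(signal):
    # The digital phase: snap every value back to the nearest bit,
    # the way a repeater station restores a fading pulse.
    return [1 if s > 0.5 else 0 for s in signal]

analog = [float(b) for b in message]
digital = [float(b) for b in message]
for generation in range(100):
    analog = noisy_copy(analog)                # errors accumulate unchecked
    digital = regenerate(noisy_copy(digital))  # errors erased each generation

print(max(abs(a - b) for a, b in zip(analog, message)))  # drift of order 1
print(digital == message)                                # True: bits preserved
```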
What could be wiser than to give people who can think the leisure in which to do it? —Walter W. Stewart to Abraham Flexner, 1939
Since the time of Archimedes and his siege engines, military commanders had brought in the mathematicians when they needed help.
According to Newton and Galileo, the path of a projectile was calculable, but in practice it was difficult to predict the behavior of a shell in flight.
One of Veblen’s recruits was Norbert Wiener, a twenty-four-year-old mathematical prodigy well trained after two years of postdoctoral study in Europe, but socially awkward and discouraged by the failures of his first teaching job. Even the army had rejected him, for poor eyesight and an inability to fire a rifle or maintain control of a horse.
Veblen had gathered a community that would redefine American mathematics in the years between World War I and World War II.
Göttingen, Berlin, Paris, and Cambridge were the centers of the mathematical world, while Harvard, Chicago, and Princeton were still far from catching up. Veblen returned to Princeton determined both to replicate the success of the European institutions and to recapture some of the informal mathematical camaraderie of the Proving Ground.
“The way to make another step forward,” he suggested, “is to found and endow a Mathematical Institute. The physical equipment of such an institute would be very simple: a library, a few offices, and lecture rooms, and a small amount of apparatus such as computing machines.”
“Flexner was not a scholar himself,” explained Klári von Neumann, “but had a very practical mind which conceived the idea that there should be a place where men whose only tools of work were their brains could spend time entirely on their own ...
‘Have you ever dreamed a dream?’
The book, expanding upon the Rhodes lectures Flexner had delivered at Oxford in 1928, gave a depressing account of higher education in America, concluding with a call for “the outright creation of a school or institute of higher learning” where “mature persons, animated by intellectual purposes, must be left to pursue their own ends in their own way … be they college graduates or not.” Flexner argued that this “free society of scholars” should be governed by scholars and scientists, not administrators, and even “the term ‘organization’ should be banned.”
“Mathematics is singularly well suited to our beginning,” he explained to the trustees. “Mathematicians deal with intellectual concepts which they follow out for their own sake, but they stimulate scientists, philosophers, economists, poets, musicians, though without being at all conscious of any need or responsibility to do so.” There were practical advantages to the field as well: “It requires little—a few men, a few students, a few rooms, books, blackboard, chalk, paper, and pencils.”
The Nazis launched their purge of German universities in April 1933, and the exodus of mathematicians from Europe—with Einstein leading the way to America—began just as the Institute for Advanced Study opened its doors. “The German developments are going bad and worse, the papers today wrote of the expulsion of 36 university professors, ½ of the Göttingen mathematics and physics faculty,” von Neumann reported to Flexner on April 26. “Where will this lead, if not to the ruin of science in Germany?”
Princeton, despite its role in the American Revolution, had become one of the more conservative enclaves in the United States, “a quaint and ceremonious little village of puny demigods on stilts,” as Einstein described it to the Queen of Belgium in 1933.
Princeton University professors referred to “the Institute for Advanced Salaries,” while Princeton University graduate students referred to “the Institute for Advanced Lunch.”
Flexner’s own tenure was short-lived. He started out determined to avoid “dull and increasingly frequent meetings of committees, groups, or the faculty itself. Once started, this tendency toward organization and formal consultation could never be stopped.”
He could not announce that the Institute was already engaged in behind-the-scenes support for work on atomic bombs, but he did announce the Institute’s unwavering support for “the critical study of that organized tradition which we call civilization and which it is the purpose of this war to preserve. We cannot, and in the long run will not, fight for what we do not understand.”
At the time of the founding of the IAS, mathematics was divided into two kingdoms: pure mathematics and applied. With the arrival of von Neumann, the distinctions began to fall. “The School of Mathematics has a permanent establishment which is divided into three groups, one consisting of pure mathematics, one consisting of theoretical physicists, and one consisting of Professor von Neumann,” Freeman Dyson explained to a review committee in 1954.
A third kingdom of mathematics was taking form. The first kingdom was the realm of mathematical abstractions alone. The second kingdom was the domain of numbers applied, under the guidance of mathematicians, to the real world. In the third kingdom, numbers would assume a life of their own.
“The essence of his philosophical, scientific, and humanitarian heritage was to do the impossible, that which was never done before,” says Nicholas of what John learned from their father Max. “His approach was doing not just what was never done before but what was considered as impossible to be done.”
Budapest, the city of bridges, produced a string of geniuses who bridged artistic and scientific gaps.
“Johnny’s most characteristic trait was his boundless curiosity about everything and anything, his compulsive ambition to know, to understand any problem, no matter on what level,”
He had an ability, “perhaps somewhat rare among mathematicians,” explains Stan Ulam, “to commune with the physicists, understand their language, and to transform it almost instantly into a mathematician’s schemes and expressions. Then, after following the problems as such, he could translate them back into expressions in common use among physicists.”
Any subject was fair game. “I refuse to accept however, the stupidity of the Stock Exchange boys, as an explanation of the trend of stocks,” he remarked to Ulam in 1939. “Those boys are stupid alright, but there must be an explanation of what happens, which makes no use of this fact.” This question led to his Theory of Games and Economic Behavior, written with Oskar Morgenstern during the war years, with von Neumann giving the project his diminishing spare time and Morgenstern contributing “the period of the most intensive work I’ve ever known.”
He was drawn to “impossible” questions—predicting the weather, understanding the brain, explaining the economy, constructing reliable computers from unreliable parts.
“It was a matter of pride with him to consider the weightiest questions in the spirit of a simple puzzle,” says Klári, “as if he was challenging the world to give him any puzzle, any question, and then, with the stop-watch counting time, see how fast, how quickly and easily he could solve it.”
Edward Teller believed that “if a mentally superhuman race ever develops, its members will resemble Johnny von Neumann,” crediting an inexplicable “neural superconductivity,” and adding that “if you enjoy thinking, your brain develops. And that is what von Neumann did. He enjoyed the functioning of his brain.” If there wasn’t anything to puzzle over, his attention wandered off. According to Herman Goldstine, “nothing was ever so complete as the indifference with which Johnny could listen to a topic or paper that he did not want to hear.”
Theodore von Kármán—the Hungarian aerodynamicist who established the Jet Propulsion Laboratory in Pasadena, built the first supersonic wind tunnel, assumed the first chairmanship of the Air Force Scientific Advisory Board, and “invented consulting,” according to von Neumann—remembers how “a well-known Budapest banker came to see me with his seventeen-year-old son…. He had an unusual request. He wanted me to dissuade young Johnny from becoming a mathematician. ‘Mathematics,’ he said, ‘does not make money.’”
“The conciseness of the system of axioms is surprising,” comments Stan Ulam. “The axioms take only a little more than one page of print. This is sufficient to build up practically all of the naive set theory and therewith all of modern mathematics … and the formal character of the reasoning employed seems to realize Hilbert’s goal of treating mathematics as a finite game.”
The mathematical landscape of the early twentieth century was dominated by Göttingen’s David Hilbert, who believed that from a strictly limited set of axioms, all mathematical truths could be reached by a sequence of well-defined logical steps. Hilbert’s challenge, taken up by von Neumann, led directly both to Kurt Gödel’s results on the incompleteness of formal systems of 1931 and Alan Turing’s results on the existence of noncomputable functions (and universal computation) of 1936. Von Neumann set the stage for these two revolutions, but missed taking the decisive steps himself.
Gödel proved that within any formal system sufficiently powerful to include ordinary arithmetic, there will always be undecidable statements that cannot be proved true, yet cannot be proved false. Turing proved that within any formal (or mechanical) system, not only are there functions that can be given a finite description yet cannot be computed by any finite machine in a finite amount of time, but there is...
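Both results admit compact statements; the following is a standard modern rendering, not the book's own wording.

```latex
\textbf{G\"odel (1931).} If $T$ is a consistent, effectively axiomatized
theory containing ordinary arithmetic, there is a sentence $G_T$ such that
$T \nvdash G_T$ and $T \nvdash \neg G_T$.

\textbf{Turing (1936).} There is no Turing machine $H$ such that, for every
machine $M$ and input $x$, $H(M, x)$ halts and correctly reports whether
$M(x)$ halts.
```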
The good news is that, as Leibniz suggested, we appear to live in the best of all possible worlds, where the computable functions make life predictable enough to be survivable, while the noncomputable functions make life (and mathematical truth) unpredictable enough ...
“The economy of the treatment seems to indicate a more fundamental interest in brevity than in virtuosity for its own sake. It thereby helped prepare the grounds for an investigation of the limits of finite formalism by means of the concept of ‘machine.’”
Von Neumann found himself at the center of a thriving mathematical community, assuming the role that Hilbert had played in the Göttingen of 1926. “I would come to Fine Hall in the morning and look for von Neumann’s huge car,” remembers Israel Halperin, a student in 1933, “and when it was there, in front of Palmer Lab, Fine Hall seemed to be lit up. There was something in there that you might run into that was worth the whole day. But if the car wasn’t there, then he wasn’t there and the building was dull and dead.”