Turing's Cathedral: The Origins of the Digital Universe
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do ...more
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb.
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand.
A digital universe—whether 5 kilobytes or the entire Internet—consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information—structure and sequence—according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.
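A minimal sketch, in Python and not from the book, of the translation the passage describes: the same bits held in place as structure (memory), read out one per step as sequence (code), and reassembled:

    # A word held as structure: eight bits varying in space, constant in time.
    stored = [1, 0, 1, 1, 0, 0, 1, 0]

    # Read it out as a sequence: the same bits varying in time, one per step.
    def shift_out(word):
        for bit in word:
            yield bit

    # Translate back: reassemble the sequence into structure.
    received = list(shift_out(stored))
    print("structure:", stored)
    print("sequence :", *shift_out(stored))
    print("match    :", received == stored)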
That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. “The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects … capable of a twofold difference onely,” he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.3 That zero and one were sufficient for logic as well as arithmetic was ...more
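A minimal sketch, in Python, of the scheme Bacon describes: five places of two symbols give 2^5 = 32 differences, more than enough for an alphabet (this uses the modern 26 letters rather than Bacon's own slightly smaller alphabet):

    # Bacon-style biliteral encoding: each letter becomes five symbols
    # drawn from a two-letter alphabet ("a"/"b"), since 2**5 = 32 >= 26.
    def encode(text):
        out = []
        for ch in text.upper():
            if "A" <= ch <= "Z":
                n = ord(ch) - ord("A")  # 0..25
                out.append(format(n, "05b").replace("0", "a").replace("1", "b"))
        return " ".join(out)

    def decode(groups):
        letters = []
        for g in groups.split():
            n = int(g.replace("a", "0").replace("b", "1"), 2)
            letters.append(chr(n + ord("A")))
        return "".join(letters)

    msg = encode("FLY")
    print(msg)                  # aabab ababb bbaaa
    assert decode(msg) == "FLY"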
Von Neumann set out to build a Universal Turing Machine that would operate at electronic speeds. At its core was a 32-by-32-by-40-bit matrix of high-speed random-access memory—the nucleus of all things digital ever since. “Random access” meant that all individual memory locations—collectively constituting the machine’s internal “state of mind”—were equally accessible at any time. “High speed” meant that the memory was accessible at the speed of light, not the speed of sound. It was the removal of this constraint that unleashed the powers of Turing’s otherwise impractical Universal Machine.
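A back-of-the-envelope check of those figures, in Python (the byte and kilobyte conversions are a modern convenience, not units the IAS engineers used):

    # 32 x 32 positions per bit plane, 40 bits per word:
    # 1,024 addressable words of 40 bits each.
    words = 32 * 32          # 1024 words
    bits  = words * 40       # 40,960 bits
    print(bits, "bits =", bits // 8, "bytes =", bits / 8 / 1024, "KB")  # 5.0 KB

    # "Random access": any word is reachable in one step, by address,
    # regardless of which word was touched last.
    memory = [0] * words
    memory[777] = 0b1010     # write word 777 directly
    print(memory[777])       # read it back just as directly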
Three technological revolutions dawned in 1953: thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA.
The mechanism of translation between sequence and structure in biology and the mechanism of translation between sequence and structure in technology were set on a collision course. Biological organisms had learned to survive in a noisy, analog environment by repeating themselves, once a generation, through a digital, error-correcting phase, the same way repeater stations are used to convey intelligible messages over submarine cables where noise is being introduced. The transition from digital once a generation to digital all the time began in 1953.
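A minimal sketch, in Python with made-up noise levels, of the repeater-station point: copying an analog value accumulates noise, while snapping back to 0 or 1 at each generation keeps the message exact:

    import random

    random.seed(1)
    def noisy(x):
        return x + random.gauss(0, 0.1)   # each copy adds a little noise

    message = [1, 0, 1, 1, 0, 1, 0, 0]
    analog  = [float(b) for b in message]
    digital = [float(b) for b in message]

    for generation in range(100):
        analog  = [noisy(x) for x in analog]          # noise accumulates
        digital = [round(noisy(x)) for x in digital]  # regenerate: snap back to 0/1

    print([round(x, 2) for x in analog])   # drifted well away from 0s and 1s
    print(digital)                         # still the original message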
What began as an isolated 5-kilobyte matrix is now expanding by over two trillion transistors per second (a measure of the growth in processing and memory) and five trillion bits of storage capacity per second (a measure of the growth in code).16 Yet we still face the same questions that were asked in 1953. Turing’s question was what it would take for machines to begin to think. Von Neumann’s question was what it would take for machines to begin to reproduce.
When the Institute for Advanced Study agreed, against all objections, to allow von Neumann and his group to build a computer, the concern was that the refuge of the mathematicians would be disturbed by the presence of engineers. No one imagined the extent to which, on the contrary, the symbolic logic that had been the preserve of the mathematicians would unleash the powers of coded sequences upon the world. “In those days we were all so busy doing what we were doing we didn’t think very much about this enormous explosion that might happen,” says Willis Ware.
Von Neumann’s talents stood out, even in Budapest. “Johnny’s most characteristic trait was his boundless curiosity about everything and anything, his compulsive ambition to know, to understand any problem, no matter on what level,” Klári recalls. “Anything that would tickle his curiosity with a question mark, he could not leave alone; he would sulk, pout and be generally impossible until, at least to his own satisfaction, he had found the right answer.” He was able to disassemble any problem and then reassemble it in a way that rendered the answer obvious as a result. He had an ability, ...more
Axiomatization is the reduction of a subject to a minimal set of initial assumptions, sufficient to develop the subject fully without new assumptions having to be introduced along the way. The axiomatization of set theory formed the foundations, mathematically, of everything else. An ambitious previous attempt, by Bertrand Russell and Alfred North Whitehead, despite 1,984 pages extending across three volumes, still left fundamental questions unresolved. Von Neumann started fresh. “The conciseness of the system of axioms is surprising,” comments Stan Ulam. “The axioms take only a little more ...more
In his axiomatization of set theory, “one can divine the germ of von Neumann’s future interest in computing machines,” says Ulam, speaking with hindsight from 1958. “The economy of the treatment seems to indicate a more fundamental interest in brevity than in virtuosity for its own sake. It thereby helped prepare the grounds for an investigation of the limits of finite formalism by means of the concept of ‘machine.’ ”33 Von Neumann’s style was now set. He would approach a subject, identify the axioms that made it tick, and then, using those axioms, extend the subject beyond where it was when ...more
“Formal logic has to be taken over by mathematicians,” Veblen had announced on New Year’s Eve 1924, when the plans for what would become the Institute for Advanced Study were first taking form in his mind. “There does not exist an adequate logic at the present time, and unless the mathematicians create one, no one else is likely to do so.”10 It was Gödel, above anyone else—and now directly above von Neumann—who proved Veblen’s instincts correct. In 1924 both von Neumann and Gödel were working on the logical foundations of mathematics, before Gödel’s incompleteness theorems brought the Hilbert ...more
Shift registers, as Leibniz had demonstrated 260 years earlier, could perform binary arithmetic simply by shifting an entire row of binary digits one position to the right or left. Data were never transferred directly between adjacent toggles; instead, the state of each individual toggle was replicated upward into a temporary register, the lower register was cleared, and then and only then were the data shifted, diagonally, back down into the original register. There was no lower bound to how slowly the computer could be stepped through a sequence of instructions. Unlike the well-behaved ...more
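A minimal sketch, in Python, of both points: shifting a row of binary digits one place multiplies or divides by two, and the transfer described above goes up into a temporary register, clears the original, and only then shifts diagonally back down:

    def show(n, width=8):
        return format(n, f"0{width}b")

    x = 0b00010110           # 22
    left  = x << 1           # shift left one place:  44 (multiply by 2)
    right = x >> 1           # shift right one place: 11 (integer divide by 2)
    print(show(x), "=", x)
    print(show(left), "=", left)
    print(show(right), "=", right)

    # The two-step transfer: copy up into a temporary register,
    # clear the original, then write back shifted one place.
    def shift_right_two_step(reg):
        temp = list(reg)              # replicate upward into a temporary register
        reg  = [0] * len(reg)         # clear the lower register
        for i in range(1, len(reg)):  # shift diagonally back down, one place right
            reg[i] = temp[i - 1]
        return reg

    bits = [0, 0, 0, 1, 0, 1, 1, 0]            # 22
    print(shift_right_two_step(bits))          # [0, 0, 0, 0, 1, 0, 1, 1] = 11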
Von Neumann’s approach was to bring a handful of engineers into a den of mathematicians, rather than a handful of mathematicians into a den of engineers. This freed the project from any constraints that might have been imposed by an established group of engineers with preexisting opinions as to how a computer should be built.
In our universe, we measure time with clocks, and computers have a “clock speed,” but the clocks that govern the digital universe are very different from the clocks that govern ours. In the digital universe, clocks exist to synchronize the translation between bits that are stored in memory (as structures in space) and bits that are communicated by code (as sequences in time). They are clocks more in the sense of regulating escapement than in the sense of measuring time. “The I.A.S. computing machine is non-synchronous; that is, decisions between elementary alternatives, and enforcement of ...more
What about von Neumann’s question—whether machines would begin to reproduce? We gave digital computers the ability to modify their own coded instructions—and now they are beginning to exercise the ability to modify our own. Are we using digital computers to sequence, store, and better replicate our own genetic code, thereby optimizing human beings, or are digital computers optimizing our genetic code—and our way of thinking—so that we can better assist in replicating them? In the beginning was the command line: a human programmer supplied an instruction and a numerical address. There is no ...more
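A toy illustration, in Python with an invented four-operation instruction set, of that early command-line style: every instruction is an operation plus a numerical address, and because instructions and data share one memory, the program can rewrite its own instructions:

    # A toy stored-program machine (the opcode names are made up here).
    # Instruction 1 patches the address field of instruction 0 on every pass,
    # so the loop walks through memory[8..10] by modifying its own code.
    memory = [
        ("ADD",     8),       # 0: acc += memory[8]   (this address gets rewritten)
        ("INCADDR", 0),       # 1: bump the address field of the instruction at 0
        ("DEC",     7),       # 2: memory[7] -= 1
        ("JUMPNZ",  7, 0),    # 3: if memory[7] != 0, jump back to 0
        ("HALT",),            # 4
        None, None,
        3,                    # 7: loop counter
        10, 20, 30,           # 8, 9, 10: the data to be summed
    ]

    acc, pc = 0, 0
    while True:
        inst = memory[pc]
        op = inst[0]
        if op == "ADD":
            acc += memory[inst[1]]
        elif op == "INCADDR":
            opcode, addr = memory[inst[1]]
            memory[inst[1]] = (opcode, addr + 1)   # the program rewrites itself
        elif op == "DEC":
            memory[inst[1]] -= 1
        elif op == "JUMPNZ":
            if memory[inst[1]] != 0:
                pc = inst[2]
                continue
        elif op == "HALT":
            break
        pc += 1

    print(acc)   # 60: summed by patching the ADD instruction's own address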