Tom Lee's Reviews > Turing's Cathedral: The Origins of the Digital Universe

Turing's Cathedral by George Dyson
Tom Lee's review, Dec 16, 2012
Rating: really liked it

I keep this photo over my desk at work. I think it looks a bit like a microscopic close-up of a drop of milk, or maybe a bacterial colony. In fact it's a shot of the Trinity Test, the planet's first atomic detonation. To me, this event and the context surrounding it are the most fascinating and amazing chapter in all of human engineering: in a panicked fight against evil, a collection of human intelligence was assembled that, through sheer intellectual might, wielded abstract mathematics and applied engineering to bend reality in an astounding new way. It's hard to imagine an engineering problem with such dizzying historical, moral and political consequences.

George Dyson has delivered a fascinating and flawed book that connects this project to my own profession: digital computing. He tells the story of Princeton's Institute for Advanced Study and its quest, led by the brilliant John Von Neumann, to build one of the world's first electronic computers, a project that was birthed by -- but, Dyson argues, destined to be even more transformative than -- the US military's atomic weapons program.

Dyson is the son of Freeman Dyson, and this is his greatest asset: he actually grew up among these brilliant minds. This puts him in an incredible position to describe what these men and women were like, and he does a very fine job. Von Neumann's own reticent, complicated brilliance leaves him something of an enigma, but Dyson conveys this well. And, although generally shorter, the portraits Dyson draws of Julian Bigelow, Alan Turing, Kurt Gödel, Stan Ulam and Klari Von Neumann -- all of whom (astoundingly) were personally involved in this story in one way or another -- are fascinating, inspiring and heartbreaking.

But this book has problems.

I'll start with a quibble. The beginning is pretentiously overloaded: it's hard to imagine why a reader would want or need an explanation of William Penn's immigration and the events that led to George Washington once marching through what would become the IAS's backyard. But the real sins occur later in the book.

The book itself diagnoses Dyson's failings via a quote from Turing on page 263: "I agree with you about 'thinking in analogies,' but I do not think of the brain as 'searching for analogies' so much as having analogies forced upon it by its own limitations." Dyson doesn't take this limitation seriously. Having ably and charmingly described the creation of general-purpose digital computing, Dyson is incapable of critically evaluating the musings of its creators. Having built the a-bomb, these inventors -- understandably -- could be similarly oblivious as they applied computational metaphors to problems of biology and the mind.

These can be helpful conceptual frames, but Dyson is not equipped to see their limitations, or to acknowledge the modest returns they have yielded over a subsequent half-century of investigation. The stories he tells are instead about avenues of investigation cut short by untimely deaths, the military industrial complex or short-sighted academic administrators. He doesn't acknowledge that Barricelli's ideas about evolutionary algorithms, for instance, have continued to be studied, and have become a useful but far-from-universal (or life-creating!) technique.
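To be concrete about what that "useful but far-from-universal" technique looks like today: below is a minimal sketch of the kind of evolutionary search descended from Barricelli's experiments. This is my own illustration, not anything from the book -- the function names, parameters and the toy "maximize the number of 1-bits" fitness goal are all invented for the example.

```python
import random

def evolve(genome_len=20, pop_size=30, generations=60, seed=1):
    """Toy evolutionary search: maximize the number of 1-bits (OneMax)."""
    rng = random.Random(seed)
    fitness = lambda g: sum(g)
    # Random initial population of bitstrings.
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(genome_len)        # single point mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

On a toy problem like this the search converges quickly; on hard real-world problems the same loop is a heuristic among many, which is exactly the gap between "useful technique" and the life-creating universality Barricelli hoped for.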

By the end, Dyson has descended into mysticism. He misreads Turing's discussion of o-machines as a tragically unrealized proposal rather than a not-implementable thought experiment deployed for theoretical ends. He thinks search engines and "Web 2.0" are evidence of the kind of emergent properties associated with guesses about the spontaneous origins of consciousness. He points to the complexity of online social networks as exemplars of new forms of computation, subtly implying that this may have philosophical significance (without bothering to ask himself what, then, a market economy, postal system or beehive might be computing). He genuinely seems about half-convinced that extraterrestrials have transmitted themselves digitally into our computer networks, where they reside, hidden. On page 293 he visits with an elderly Edward Teller, to whom he explains this last theory. Teller, gently and sensibly, suggests that Dyson write a science fiction novel instead of his current project. It's good advice.

Certainly, others have made these mistakes before. One can hardly blame the creators of this incredibly powerful technology for optimistically imagining applications beyond its eventual reach. A great example comes in Chapter 9, which tells the story of the birth of computational meteorology. The meteorological status quo felt that their discipline was destined to remain an art; some upstarts felt that computational approaches were the path forward. The latter camp, with Von Neumann as their midwife, was thoroughly vindicated both in their own time and in the decades since. Yet one ought to note Von Neumann's triumphalist predictions that, once weather systems could be predicted, manipulating them would be trivial. He predicted weather control and meteorological warfare! This proved a wild overestimation of the problem's tractability. Yet as the book progresses, Dyson takes the IAS staff's increasingly implausible speculations about consciousness and man's eventual subservience to machine and spins them out through his own, much-less-grounded imagination (Cory Doctorow thinks this is all great stuff, by the way).

The basic problem is that Dyson doesn't truly understand much of the technology he's writing about. His grasp of technical detail is often very good for a non-engineer. But he lacks the virtuosic comprehension that made the individuals at the heart of this story so remarkable, and which is a prerequisite for the kind of speculation he wishes to indulge in.

Let me be quick to add that I don't have that kind of virtuosity -- aside from a general lack of brilliance, the demotion of mathematics from computer science's essential foundation to its mere supplement (perhaps inevitable as CS's complexity has increased) arguably makes it much harder to achieve that kind of understanding these days -- but the big picture should be evident to anyone who's read even a little Hofstadter. The conceptual story here is about abstraction, the Unreasonable Effectiveness of Mathematics, and the doors opened by driving computation to a rate that can achieve results that are unattainable through more elegant and precise theoretical methods*. Instead Dyson often gets bogged down in meaningless minutiae about dimensionality, floating point arithmetic and whether data is represented spatially or temporally. The point of the story is that none of this matters! But Dyson doesn't grasp this. On pages 255 and 256, in particular, it becomes clear that the universality of the Turing Machine -- the whole point of the damn idea -- is lost on him.
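The universality point can be made concrete with a toy example. Below is my own minimal sketch of a Turing machine simulator (nothing from the book; the rule-table format and names are invented for illustration). The thing to notice is that the simulator itself is a fixed, tiny mechanism: swap in a different rule table and it computes something else entirely. That interchangeability is why details like spatial-versus-temporal data representation are incidental.

```python
def run_tm(tape, rules, state="start", blank="_", max_steps=1000):
    """Minimal Turing machine. rules maps (state, symbol) -> (write, move, next_state)."""
    cells = dict(enumerate(tape))  # sparse tape; every other cell is blank
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells.get(head, blank))]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# One possible rule table: increment a binary number.
# Scan right to the end of the input, then carry leftward.
inc = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "done"),
    ("carry", "_"): ("1", "L", "done"),
    ("done", "0"): ("0", "L", "done"),
    ("done", "1"): ("1", "L", "done"),
    ("done", "_"): ("_", "R", "halt"),
}
```

For example, `run_tm("1011", inc)` yields `"1100"`. The same `run_tm` function, fed a different table, would sort, search, or simulate another Turing machine -- which is the whole point of the damn idea.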

This is a shame, and it makes much of the book's end a waste. Things really start to go off the rails in the chapter on Turing (the book is largely organized into chapters focusing on individuals, which proves to be a wise choice), though Engineer's Dreams, the chapter that follows it, is well worth reading for its portrait of Von Neumann's inability to confront death -- it's a highlight of the book -- even if it then descends into some of the book's most risible speculation. Klari Von Neumann's fate, explained at the book's very end, also packs emotional weight, though you'll have had to wade through a lot of nonsense to get there.

Still, this book was a pleasure, and I'm grateful to Dyson for his portrait of a remarkable time filled with remarkable people. Highly recommended as a history, but whatever you do, don't take its analogizing and speculation seriously.

* to be fair, Dyson is actually very good on this last point
