Jenny Brown's Reviews > Turing's Cathedral: The Origins of the Digital Universe

Turing's Cathedral by George Dyson


it was ok

This book is fatally marred by Dyson's failure to understand computer architecture. I notice many reviewers assuming they are confused because they are math-phobic. But I was a programmer in the late 1970s and 1980s. I wrote in assembly language and have read machine language (in hex) while debugging, so when I read Dyson's long passages of gibberish purporting to describe what is going on inside a computer, I knew they were just plain gibberish.

The stories about the people involved in the project were very interesting, as was the description of the environment--the IAS--and its politics, but Dyson's failure to explain the most important technical concepts to the reader (or to understand them himself) limits the book's usefulness. These concepts are not all that complex. They were explained to a class of recent high school graduates in Tennessee back when I took my first computer class in 1979, and we all wrote a simplified machine-language program for our final exam. So there is no reason the intelligent readers this book was designed for couldn't have had the architecture of a von Neumann machine explained in more depth--in a way that would have made sense to them.

I found this book saddening because it made me really want to understand the technology whose description it butchers. I'd love to know more about how the engineers built these early computers and how they differ from the ones in use today. I gleaned many facts from the mishmash presented here--for example, that CRTs were used in place of what slightly later computers would call "core" memory.

But Dyson clearly doesn't understand what a register is, why shifting bits performs arithmetic, or how reading the same bits in a CPU provides the machine with its program. For that matter, he doesn't seem to understand what a program is. A simple explanation of what an algorithm is, and of why 17 instructions can lead to thousands of operations being performed, would make long tracts of this book make sense, which they currently don't.

But statements like the one claiming that the CRT screen on a PC represented a CPU buffer were sheer imbecility. (The CRT on that computer functions as an output device, similar to the tape or punched paper on the early computers.) And there were dozens more of these uninformed statements.

Bottom line: if you read this book and are confused, you are getting the author's message, since he, too, was terminally confused by the technology he was attempting to describe.


Reading Progress

Started Reading
August 14, 2012 – Shelved
August 14, 2012 – Finished Reading

Comments Showing 1-14 of 14


message 1: by Marie (new)

Marie Such a good point! If the author doesn't understand, how am I going to get it? Especially as I don't know all that much about computers in the first place—although that bit about "the CRT screen on a PC represented a CPU buffer" sounds pretty absurd to me as well, even with my limited knowledge.

Thanks for the review! I thought the book sounded interesting, but that kind of issue would have driven me nuts.


Barry I don't think you understood the role of cathode-ray tubes as memory, with the correctly described side effect of creating a view of the memory contents. 'Imbecility' is a poorly judged complaint when you're wrong. Admittedly Dyson is vague, but he makes no actually incorrect statement (that I could find, at least) about the role of bit shifting (multiply/divide by two) in binary arithmetic.


Jenny Brown I understood perfectly the way cathode-ray tubes were used as a primitive form of memory in early computers. But the CRT attached to a 1980s PC is not displaying memory; it's outputting the contents of a buffer. In my youth I wrote assembly code that modified CRT buffers. Dyson appears to have little understanding of the brilliant conceptual innovation at the heart of machines that use the same address space for program and data.


Michael Berman Thank you, thank you, thank you. I'm a bright enough person--not a coder--and I've read extensively in science and technology. I'm finding this book incomprehensible. I really hoped to *understand* what the first computers did, and, halfway through the book, I'm no better informed than when I started. I'm giving myself permission to not finish this one.


message 5: by Ted (new) - rated it 4 stars

Ted Tschopp I think you should give the book a re-read based on this statement:

"But statements like one claiming that the CRT screen on a PC represented a CPU buffer were sheer imbecility. (The CPU on that computer functions as an output device, similar to the tape or punched paper on the early computers.) And there were dozens more of these uninformed statements."

The MANIAC used CRTs for RAM.

See the picture on this web page.

http://paw.princeton.edu/issues/2012/...

That's von Neumann, and those shiny metal tubes with a hole at the end are the CRTs it used as RAM.

See the Wikipedia page on the Williams tube for a picture of an exposed tube:

http://en.wikipedia.org/wiki/Williams...


Jenny Brown Ted,

I agree that the earliest computers used a CRT as memory. But modern PC monitors don't, and Dyson specifically said that PCs do the same thing. That was my point.


message 7: by Toni (new)

Toni If you would like to know more about Alan Turing, read Andrew Hodges's book "Alan Turing: The Enigma," updated in 2014 with an explanatory preface. It's also much more readable.


Vivian Thank you for this review... I DID think I just wasn't following the math, so couldn't grasp the concepts of the computer architecture. Glad to know it's not just me!


message 9: by Gy (new) - added it

Gy I like your review, but despite that I'm about to read it. Sometimes we learn more from others' mistakes--and I say that without any malicious thought or intent. Thanks a lot.


Polyglot27 I totally agree with your review. In the end I just skipped over the technical gibberish and read about the lives of those who were involved. The author also seems to have a problem with time: it was really annoying the way he jumps back and forth. In one paragraph he can go from 1953 to 1942, then to 1955, and back to 1939.


message 11: by Gc (new)

Gc Here's Dyson's entire paragraph re CRTs and memory buffers (in Chapter 8):

In modern (or once-modern) computers, a cathode-ray tube (CRT) displays the state of a temporary memory buffer whose contents are produced by the central processing unit (CPU). In the MANIAC however, cathode-ray tubes were the core memory, storing the instructions that drove the operations of the CPU. The use of display for memory was one of those discontinuous adaptations of preexisting features for unintended purposes by which evolution leaps ahead.

Dyson does not say modern PCs "do the same thing" (i.e., use the CRT as memory).

We might better understand Dyson's brief description of modern PC displays and memory if we first understand what a "frame buffer" is (http://www.webopedia.com/TERM/F/frame...).


message 12: by Brian (new)

Brian Dear In my own upcoming book I have a chapter about the use of cathode-ray tubes as memory storage, but the people I interviewed called them "storage tubes" (they used ones made by Raytheon, in the ILLIAC era). These were not *display* CRTs but just tubes that could store digital memory; they needed hardware that could "read" the memory and send it out as a video signal to a conventional CRT which could then display it.


message 13: by Nilo (new)

Nilo De Roock I planned to read this... but after reading your review I decided not to. Time is too valuable.


message 14: by Jean (new) - rated it 4 stars

Jean "Sheer imbecility," in my humble opinion, is to expect this book to be some kind of technical reference for the ENIAC, or a textbook on computer architecture.

