After reading The Structure of Digital Computing, you will learn the answers to the following questions:
How would you explain the last 50 years of computing if you had only 90 seconds? (Chapter 1)
What is commoditization and why is it so important to understanding computing trends? (Chapter 2)
How can you distinguish important technical advances in computing from market clutter? (Chapter 3)
Do new computing technologies generally take 1 year, 2 years, 5 years, 10 years, or longer to develop? (Chapter 4)
What is big data? (Chapter 5)
The Structure of Digital Computing takes a fifty-year perspective on computing and discusses what is significant, what is novel, what endures, and why it is all so confusing. The book tries to balance two points of view: digital computing as viewed from a business perspective, where the focus is on marketing and selling, and digital computing as viewed from a research perspective, where the focus is on developing fundamentally new technology.
I'm a faculty member at the University of Chicago and a Partner at Open Data Group. At the University of Chicago, I'm the Director of Informatics at the Institute for Genomics and Systems Biology, a Senior Fellow at the Computation Institute, and a Professor in the Division of Biological Sciences. I also founded Open Data Group in 2002, which since then has been one of the leaders in building predictive models over big data.