Despite the fact that there's no real reason to be apologetic, I haven't yet reached the point in my career as a software developer where I've stopped apologizing for the fact that I have no "real" Computer Science background.[^1] And/but that's exactly what draws me to books like Understanding Computation by Tom Stuart (O'Reilly, 2013). Stuart describes the book as being for:
> ...programmers who are curious about programming languages and the theory of computation, especially those who don’t have a formal background in mathematics or computer science.
In other words, people like me: the people Eric Miraglia described as the "liberal arts majors drafted into web-developer service during the dotcom boom".[^2] Yes: the liberal-artsy, non-computer-science degree holders who wound up doing computer-sciencey software work just the same. Smart people who are nevertheless exploring some of these concepts for the first time.
For a taste of what I mean, observe the following quote:
> In the end, syntax is only concerned with the surface appearance of programs, not with their meanings.
If that made you smile just a little bit, because you want to peel the onion layers away and get at the semantic questions underneath... then this book is for you.
Now before we go any further -- a couple words on what this book is not. This is not a book about software engineering. "Big O notation" does not make an appearance here in the text, not once. It isn't that we aren't concerned with "clean code" or efficiency or scalability or any of that -- but those are just implementation details that are not particularly interesting when you're otherwise diving into language semantics and finite automata.
Instead, the kind of questions this book poses are more like:
- What is a computer?
- How do you create one?
- How does a computer compute?
- How would you go about designing a language?
- Are there limits to computation? What are they?
These are fascinating questions, and reading this book was one of the first times where I really understood what separates "computer science" from "software engineering". The book is a fascinating read, even if it occasionally lapses into sentences like this:
> The NFA class automatically takes account of free moves — we can see that when our NFA is started in state 3, it’s already possible for it to be in state 2 or 3 before it has read any input — so we won’t need to do anything special in NFASimulation to support them.
Awesome, but a little opaque.[^3] And/but, there's quite a bit of this; for a "simple" book about foundational computer science thinking, there's a lot of complicated material. ("But this is why you're here, right? To dip your toes into this particular pool?") It's often challenging, but also rewarding. (And there's plenty of humor along the way, if you're paying attention.)
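To unpack the jargon slightly: "free moves" are transitions an NFA can take without reading any input, so the set of states the machine might occupy has to include everything reachable for free. Here's a minimal sketch of that closure computation, with a toy transition table of my own invention, not the book's NFA class:

```ruby
require 'set'

# Hypothetical free-move table: from state 3 we can slide to state 2
# without consuming any input.
FREE_MOVES = { 3 => Set[2] }

# Keep expanding the set of current states with everything reachable
# through free moves alone, until nothing new appears.
def follow_free_moves(states, free_moves)
  loop do
    more = states + states.flat_map { |s| (free_moves[s] || Set[]).to_a }
    return states if more == states
    states = more
  end
end

follow_free_moves(Set[3], FREE_MOVES) # a set containing both 3 and 2
```

That's the whole trick behind the quote: before reading a single character, an NFA started in state 3 is effectively in states 2 and 3 at once.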
A year from now, when I'm settling in to re-read this one, I probably won't remember the details about pushdown automata or what formalisms define a "universal Turing machine". However, I'm going to remember the in-depth discussions of static typing vs. dynamism and the trade-offs between them. I'll remember that we went through some engrossing exercises where we built up integers and arithmetic operators from scratch (even if I don't remember the details). I'll remember the flirtatious foray into language design. I'll remember the Halting Problem and most of the gory details around the limits of computation. And I'll remember statements like this:
> The purpose of an expression is to be evaluated to produce another expression; a statement, on the other hand, is evaluated to make some change to the state of the abstract machine.
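To make that distinction concrete, here's a sketch in Ruby (the book's implementation language), in the spirit of its toy interpreter; the class names and details here are my own reconstruction, not Stuart's listing:

```ruby
# Expressions evaluate to values.
Number = Struct.new(:value) do
  def evaluate(_env)
    value
  end
end

Variable = Struct.new(:name) do
  def evaluate(env)
    env[name]
  end
end

Add = Struct.new(:left, :right) do
  def evaluate(env)
    left.evaluate(env) + right.evaluate(env)
  end
end

# A statement, by contrast, "evaluates" to a changed machine state:
# here, a new environment with an updated variable binding.
Assign = Struct.new(:name, :expression) do
  def evaluate(env)
    env.merge(name => expression.evaluate(env))
  end
end

env = { x: 1 }
Add.new(Variable.new(:x), Number.new(2)).evaluate(env) # => 3
Assign.new(:y, Number.new(5)).evaluate(env)            # => { x: 1, y: 5 }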
If you're "doing software" and you don't have a degree in computer science, I'm going to urge you to read this book. Even if you can't follow everything, you'll walk away with a lot of insight. (I'm particularly fond of the first and last thirds.) And if you do have a CS degree... well, it's still worth skimming.
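For the curious: that integers-from-scratch exercise builds Church numerals out of procs, where a number n is simply "a thing that applies a function n times". A minimal sketch from memory, not the book's exact listing:

```ruby
# Church numerals as Ruby procs (my reconstruction, not the book's code).
ZERO      = ->(f) { ->(x) { x } }
INCREMENT = ->(n) { ->(f) { ->(x) { f.(n.(f).(x)) } } }
ADD       = ->(m) { ->(n) { n.(INCREMENT).(m) } }

# Convert back to an ordinary Fixnum to check our work.
TO_INTEGER = ->(n) { n.(->(i) { i + 1 }).(0) }

TWO   = INCREMENT.(INCREMENT.(ZERO))
THREE = INCREMENT.(TWO)
TO_INTEGER.(ADD.(TWO).(THREE)) # => 5
```

No integer literals anywhere, and yet: arithmetic.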
[^1]: The "full disclosure" footnote: I was briefly declared as a computer science major while in college. I performed... serviceably in an intro class (Common Lisp!) and in an "intro to AI" class (more Lisp!) but got ground to a frustrating halt in "Programming and Data Structures". I blamed Dr. Hunter, but I think I just wasn't ready to think that way. Call me a late bloomer. Call me a self-learner. Call me a frustrating bastard.
[^3]: Partly my own fault for doing so much of the reading as my too-late-night bedtime reading.
I like the idea of this book more than the book itself. Granted, Tate took on a daunting task: how do you introduce seven divergent languages, with seven divergent styles and seven divergent intents, in the space of one book? The mission is a good one at least: introduce apprentice or journeyman programmers to a diverse array of programming languages and styles, to help them break out of their comfortable little already-known toolkit.
The approach is at least a half-way decent one: introduce a language, give three days' worth of lessons (plus homework), and then use that to bridge into the next language/style.
But it also comes across as very surface-level. Tate even comes right out and says, in many instances, that he is just barely scratching the surface of each language, that he is giving some language feature only a cursory overview, or else leaving it out altogether. So you wind up with your appetite whetted but without any satisfying take-away knowledge (e.g., my experience here with Scala and Clojure), or else you're turned off to that language altogether (e.g., my experience with Io and Prolog).