Despite the fact that there's no _real_ reason to be apologetic, I also haven't yet reached the point in my career as a software developer where I've stopped apologizing for the fact that I have no "real" Computer Science background. And/but that's exactly what draws me to books like "Understanding Computation" by Tom Stuart (O'Reilly, 2013). Stuart describes the book as being for:
"...programmers who are curious about programming languages and the theory of computation, especially those who don't have a formal background in mathematics or computer science."
For a taste of what I mean, observe the following quote:
"In the end, syntax is only concerned with the surface appearance of programs, not with their meanings."
If that made you smile just a little bit, because you want to peel the onion layers away and get at the semantic questions underneath... then this book is for you.
Now before we go any further -- a couple words on what this book is **not**. This is _not_ a book about software engineering. "Big O notation" does not make an appearance here in the text, not once. It isn't that we aren't concerned with "clean code" or efficiency or scalability or any of that -- but those are just implementation details that are not particularly interesting when you're otherwise diving into language semantics and finite automata.
Instead, the kind of questions this book poses are more like:
- What _is_ a computer?
- How do you create one?
- How does a computer _compute_?
- How would you go about designing a language?
- Are there limits to computation? What are they?
These are fascinating questions, and reading this book was one of the first times I really understood what separates "computer science" from "software engineering". The book is a compelling read, even if it occasionally lapses into sentences like this:
"The NFA class automatically takes account of free moves -- we can see that when our NFA is started in state 3, it's already possible for it to be in state 2 or 3 before it has read any input -- so we won't need to do anything special in NFASimulation to support them."
_Awesome_, but a little opaque. And/but, there's quite a bit of this; for a "simple" book about foundational computer science thinking, there's a lot of complicated material. ("But this is why you're here, right? To dip your toes into this particular pool?") It's often challenging, but also rewarding. (And there's plenty of humor along the way, if you're paying attention.)
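To unpack the jargon a bit: "free moves" are transitions an NFA can follow without consuming any input, so before reading a single character the machine may already be in several states at once. The book works this out in Ruby; here's a minimal Python sketch of the same idea (the state numbers and transitions are my own illustration, chosen to echo the quote, not the book's actual machine):

```python
# Sketch of "free moves" in an NFA: expand a set of states with
# everything reachable via transitions that consume no input.
# (Illustrative example, not the book's Ruby implementation.)

def follow_free_moves(states, free_moves):
    """Return states plus all states reachable through free moves."""
    result = set(states)
    while True:
        # All states one free move away from anywhere in the current set.
        reachable = {t for s in result for t in free_moves.get(s, set())}
        if reachable <= result:
            return result  # nothing new to add; we've found the closure
        result |= reachable

# A machine where state 3 can slide to state 2 for free:
free_moves = {3: {2}}

# Started in state 3, the NFA is effectively in states 2 *and* 3
# before it has read any input.
print(follow_free_moves({3}, free_moves))  # {2, 3}
```

This is why `NFASimulation` needn't treat free moves specially: as long as every state set is expanded to its closure up front, the rest of the simulation can pretend they don't exist.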
A year from now, when I'm settling in to re-read this one, I probably won't remember the details about pushdown automata or what formalisms define a "universal Turing machine". However, I'm going to remember the in-depth discussions of static vs. dynamic typing and the trade-offs between them. I'll remember that we went through some engrossing exercises where we built up integers and arithmetic operators from scratch (even if I don't remember the details). I'll remember the flirtatious foray into language design. I'll remember The Halting Problem and most of the gory details around the limits of computation. And I'll remember statements like this:
"The purpose of an expression is to be evaluated to produce another expression; a statement, on the other hand, is evaluated to make some change to the state of the abstract machine."
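That distinction is easy to see in miniature. Here's a toy sketch of my own (the book builds its interpreters in Ruby; these class names are mine, not Stuart's): an expression like `1 + 2` evaluates to another value, while an assignment statement evaluates to a changed environment.

```python
# Toy illustration of expressions vs. statements.
# (My own example, not the book's Ruby interpreter.)

class Number:
    """An expression that is already a value."""
    def __init__(self, value):
        self.value = value

    def evaluate(self, environment):
        return self.value

class Add:
    """An expression: evaluating it produces another value."""
    def __init__(self, left, right):
        self.left, self.right = left, right

    def evaluate(self, environment):
        return self.left.evaluate(environment) + self.right.evaluate(environment)

class Assign:
    """A statement: evaluating it produces a changed environment."""
    def __init__(self, name, expression):
        self.name, self.expression = name, expression

    def evaluate(self, environment):
        return {**environment, self.name: self.expression.evaluate(environment)}

# An expression reduces to a value...
print(Add(Number(1), Number(2)).evaluate({}))               # 3
# ...while a statement reduces to a new state of the machine.
print(Assign('x', Add(Number(1), Number(2))).evaluate({}))  # {'x': 3}
```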
If you're "doing software" and you don't have a degree in computer science, I'm going to urge you to read this book. Even if you can't follow everything, you'll walk away with a lot of insight. (I'm particularly fond of the first and last thirds.) And if you _do_ have a CS degree... Well, it's still worth skimming.