Review: "JavaScript: The Good Parts"

JavaScript has always been a bit of a mystery to me. Every time I've had any contact with it, I've been left wondering why anyone in their right mind would have anything to do with it. Let alone what kind of criminal mind would have originally conceived of such an abortion. I've never found (or heard tell of) anything good about it, other than its ubiquity: JavaScript is built into every browser, and no alternative is. As far as I can tell, it's this ubiquity that helps JavaScript cling to life like a daglock, despite being little more than hair matted with dried shite.

I've always been curious, though, if only thanks to a weak belief in the "no-one's useless; they can always serve as a bad example" principle. So when I stumbled across a book with the title "JavaScript: The Good Parts", I thought I'd take a look.

It is, as you'd imagine, a very slim volume. It's about 140 pages, but even that exaggerates the amount of content. There's plenty of whitespace in the book (a good thing) and many things are taken too slowly (a bad thing, because many tricky things are taken too fast).

It starts off quite amusingly, but as early as page 3 I was starting to worry about the author. The verse numbering here is my own. In the original, this is one paragraph with no breaks:

1. The fashion in most programming languages today demands strong typing.

2. The theory is that strong typing allows a compiler to detect a large class of errors at compile time. The sooner we can detect and repair errors, the less they cost us.

3. JavaScript is a loosely typed language, so JavaScript compilers are unable to detect type errors. This can be alarming to people who are coming to JavaScript from strongly typed languages.

4. But it turns out that strong typing does not eliminate the need for careful testing.

5. And I have found in my work that the sorts of errors that strong type checking finds are not the errors I worry about.

6. On the other hand, I find loose type checking to be liberating. I don't need to form complex class hierarchies.

7. And I never have to cast or wrestle with the type system to get the behavior that I want.

You'll notice he starts off in verse 1 with an unsubstantiated pejorative: that "fashion" "demands" strong typing. Not that programmers find it useful. No, "fashion" "demands" it.

He mentions the "theory" of "strong typing" like a creationist might mention the "theory" of "evolution", without bothering to point out which of the logical steps in his presentation of the "theory" he considers to be incorrect.

I assume it's the devil who puts errors in our code.

Actually, while we're going with this analogy, let's throw in the old "missing link" fallacy: verse 4 helpfully points out that stronger typing can't catch all errors. As if anyone ever claimed it would.

We're given no details of what proportion of errors are type errors, or whether it's more efficient (by any metric) to catch those errors via type signatures/annotations than via manually writing unit tests (which, of course, also contain errors).

In verse 5, we have a nice example of "argument by anecdote" made especially worthless by the rather vague term "errors I worry about". Note that he doesn't say "errors I make" or "errors commonly made by a cross-section of programmers". He says "errors I worry about". Does this mean he's happy to ignore type errors? Or that JavaScript gives him so many opportunities for new classes of error (like scope errors and binding errors and whitespace errors and RTTI errors and floating point arithmetic errors and equality errors and so on) that he just doesn't have time to worry about old-fashioned stuff like type errors that people writing code in sane languages have the luxury of concerning themselves with? Who knows?
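If "equality errors" sounds invented, it isn't: JavaScript's == operator coerces its operands before comparing, with famously inconsistent results. A few examples:

```javascript
// JavaScript's == coerces before comparing; results like these are
// exactly why the book itself ends up recommending === instead.
var r1 = ('' == '0');          // false: two strings, compared as strings
var r2 = (0 == '');            // true: '' coerces to 0
var r3 = (0 == '0');           // true: '0' coerces to 0
var r4 = (false == 'false');   // false: 'false' coerces to NaN
var r5 = (null == undefined);  // true: special-cased in the spec
```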

Verse 6, without examples, would appear to be an admission that the author isn't any good at class-based OO programming. I'm not sure what the relevance is, other than a warning to the reader.

I realize this dissection of a single paragraph is of only tangential relevance to the book review this is supposed to be. But I'm sick of hearing this same old crap time and time again. You want to fight? Okay, let's fight. But let's fight with science, like grown ups. Fighting with religion like mystics is demeaning to us both, and fundamentally pointless. Show me the data, or shut the fuck up.

I left verse 7 in to show that I'd like to be above this kind of name-calling. He has a point, that most type systems aren't as expressive as we'd like at times. But there are notable exceptions, and whenever I see a "loose typist" say things like this, I think "there goes someone who had the misfortune never to have used Haskell or SML or whatever". Sure, all popular languages' type systems have their weaknesses, but I don't understand why anyone would assume that the solution is to throw the whole thing out. It's not like we have proof that the problem can't be satisfactorily solved. Or that, returning to the horse you thought I'd already flogged quite to death, that half an eye isn't better than no eye. Quite the opposite.

And it's not like anyone bothers to back up their anti-type system prejudice with any research. Do we stop testing because testing can't catch all errors, or do we try harder, and use other things to help us catch errors too?

The onus is on the "loose typists" to demonstrate that they're not crazy, which is exactly why they always resort to this kind of sophistry. It's so much easier.

(As a final aside before I return to the book, many of the best programmers I know understand programs via the types. This too is often overlooked by "loose typists", and sometimes leads me to wonder if there's a deeper difference in the way the two groups see, understand, and reason about the world. In which case demonstrating, say, a software-quality superiority of one style over the other would solve nothing.)

Anyway, ignoring the nonsense – bad sign though it is, when setting out one's stall – and turning the page, that's chapter 1 ("Good Parts") done with. On to "Grammar", a chapter which takes 15 pages, mostly consumed by "railroad diagrams" of the kind used in Pascal textbooks from the 1980s, bizarrely enough. Sadly, these are even harder to read than the traditional kind because there are no arrowheads on the arcs. You're faced with directed graphs where you have to work out the directionality yourself.

How very fitting.

I don't understand why some variant of BNF wasn't used. This book is aimed at people who can already program, so it wouldn't seem unreasonable to assume they can already read a variant of BNF, or learn one in a couple of paragraphs. (If you make it to the end, pages 125-135 repeat the diagrams from this chapter, to no particular advantage.) But don't skip this chapter: in between the diagrams there are many important gotchas revealed. If you noticed me talk about "whitespace errors" earlier, for example, read the section on "return" and weep. (Though note that the text here is misleading. There's a clearer and more accurate explanation later in the book. So on second thoughts, maybe you should skip this chapter after all.)
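For those who haven't met it, the gotcha is automatic semicolon insertion: put a line break between "return" and its value, and JavaScript silently inserts a semicolon and returns undefined. A minimal demonstration:

```javascript
// A line break after "return" gets a semicolon silently inserted,
// so the object literal below is parsed as an unreachable block.
function broken() {
    return
    {
        ok: true
    };
}

function working() {
    return {
        ok: true
    };
}

var a = broken();   // undefined: the return returned nothing
var b = working();  // the object we meant to return
```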

Skipping ahead to page 36, we learn that the one thing JavaScript appears superficially to have got right, C-like syntax, is also fundamentally broken. Those curly-brace delimited blocks don't introduce a new scope. Strangely, the chapter spends more time on currying and memoization than it does on this, which is one of the things so broken about JavaScript that I'd heard of it even before reading the book. Currying and memoization are presented seemingly for their coolness rather than their usefulness to JavaScript programmers. If you want to be a functional programmer, I can't think of a worse language than JavaScript. And if you must run in a browser, real functional programmers are smart enough to write SML-to-JavaScript compilers in Haskell anyway. (Given a decade and a handful of PhD students.)
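To be concrete about what "don't introduce a new scope" means: var declarations are scoped to the enclosing function, not the enclosing braces.

```javascript
// "var" is function-scoped, not block-scoped: x leaks out of the if.
function blockScopeDemo() {
    if (true) {
        var x = "inner";
    }
    return x;  // no ReferenceError; x is visible throughout the function
}

var result = blockScopeDemo();  // "inner"
```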

Chapter 5 talks about how JavaScript's prototype-based system of inheritance is "more expressive" than the usual class-based systems. Without, of course, dwelling on why class-based systems are more common. Or why previous prototype-based systems (NewtonScript, say, or Self) never went anywhere. No. What we have instead is a section that explains how to imitate class-based inheritance (but pointing out that it doesn't really work out thanks to some bad decisions in JavaScript's design), a so-called "functional" scheme whose purpose is seemingly to paper over the lack of privacy in JavaScript, and a presentation of mixins by a JavaScript-specific name ("parts").
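For reference, the prototypal style the chapter describes boils down to making new objects that delegate to existing ones. This is my own minimal approximation, not the book's exact helper:

```javascript
// A minimal sketch of prototype-based inheritance. The book defines
// a similar "create" helper; this version is my own approximation.
function create(proto) {
    function F() {}
    F.prototype = proto;
    return new F();
}

var animal = {
    describe: function () {
        return this.name + " says " + this.sound;
    }
};

var dog = create(animal);   // dog delegates to animal
dog.name = "Rex";
dog.sound = "woof";

var said = dog.describe();  // "describe" is found via the prototype chain
```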

The "Arrays" chapter seems to claim that arrays are really just maps, and hence slower than you'd imagine an array to be, but surely that's a quality-of-implementation detail? Surely an implementation is free to special-case the (presumably common) case where the keys are consecutive (or at least dense) integers? The "delete" operator is weird (I don't like Python's either), while the "splice" method just seems badly named, and like a good example of how loose typing seduces people into making one method do too much. For an array, "for in" is even more screwed up than we've already been shown it is for objects. Strangely, JavaScript seems to have no range type or literals, so you're reduced to:

var i;
var v;
for (i = 0; i < array.length; ++i) {
    v = array[i];
    // ...
}
Going back to the lack of block scope, you could move the "var v" inside the "for" loop, but it would mean the same as the code shown here, and thus be more confusing.
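And the "for in" problem mentioned above is easy to demonstrate: it walks property names (as strings, in no guaranteed order), including anything else that's been attached to the array.

```javascript
// "for in" on an array visits property names, not just indices,
// so anything added to the array (or its prototype) shows up too.
var array = [10, 20, 30];
array.note = "not an index";

var seen = [];
var key;
for (key in array) {
    seen.push(key);  // "0", "1", "2" (as strings)... and "note"
}
```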

(You'll note I'm left speculating about a lot of things. The author doesn't really go into implementation issues, and when he does, he doesn't actually come across as someone intimately familiar with the implementations anyway.)

Page 61 contains the kind of code that makes you not want to work with its author. It shows you how to write a home-made heuristic test for whether a given object is really an array. Why, you might wonder, would someone who doesn't go in for type systems care? So they can "write functions that do one thing when passed a single value and lots of things when passed an array of values". Except these functions won't be 100% sure they've really got an array. Cunning plan, Baldrick.
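For the record, the heuristic is of roughly this shape (my approximation, not Crockford's exact code), and it's fooled by array-likes such as the arguments object; the Object.prototype.toString trick, by contrast, is reliable:

```javascript
// A home-made "is this really an array?" heuristic of roughly the
// kind the book presents (my approximation, not the book's code).
function is_array_heuristic(value) {
    return Boolean(value) &&
        typeof value === 'object' &&
        typeof value.length === 'number' &&
        !value.propertyIsEnumerable('length');
}

var realArray = [1, 2, 3];
var impostor = (function () { return arguments; }(1, 2, 3));

var a = is_array_heuristic(realArray);  // true
var b = is_array_heuristic(impostor);   // also true: fooled

// The reliable check:
var c = Object.prototype.toString.call(realArray) === '[object Array]';
var d = Object.prototype.toString.call(impostor) === '[object Array]';
```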

There follows more wishful thinking along functional programming lines, leaving me honestly curious to know how much this crops up in the stuff JavaScript's actually used for. Or maybe JavaScript programmers spend most of their time fantasizing that they're actually programming in some other language. A language that doesn't suck.

Odd though it may seem to have a single 12-page chapter on regular expressions that doesn't really deal with language-specific quirks, there's one in this book. I doubt that anyone who didn't understand regular expressions going in could gain much from the presentation, and given there was nothing of value for people who already know regular expressions (beyond the fact of the Ruby-like slash-delimited syntax and the exact details of the three possible trailing flags), I'm not sure what purpose this chapter serves.

Chapter 8 is a nice brief summary of the built-in types' methods, though much space is wasted on implementing functions in terms of each other. Here's most of the author's explanation of push, for example:

Array.method('push', function () {
    this.splice.apply(
        this,
        [this.length, 0].
            concat(Array.prototype.slice.apply(arguments)));
    return this.length;
});
I'm sure that's really helpful.

Some of the text in this chapter (which is otherwise pretty clear and to the point) is annoyingly vague, too. We're told, for example, that "shift is usually much slower than pop", but what does that mean? That some implementations back arrays with vectors but other ones use deques? That bad implementations use arrays and good ones deques? That standards-conforming ones use arrays and non-conforming ones use deques? Something else entirely? (You'll recall that we were told in the beginning of chapter 6 that JavaScript arrays are more like hashes, though there too we were left on our own as to what that really means.)

I mentioned earlier that JavaScript doesn't have a range type, which means that counted loops end up written out in longhand. In this book, they're even longer than usual, looking like this:

for (i = 0; i < 4; i += 1) {
    // ...
}

Not because JavaScript doesn't have operator++. It does. But Crockford doesn't like operator++: they "have been known to contribute to bad code by encouraging excessive trickiness" and "are second only to faulty architecture in enabling viruses and other security menaces" (page 122). Is this true of JavaScript? We're given no reason to believe it is. There's a longer version of this same madness on page 112:

The increment and decrement operators make it possible to write in an extremely terse style. In languages such as C, they made it possible to write one-liners that could do string copies:

for (p = src, q = dst; !*p; p++, q++) *q = *p;

They also encourage a programming style that, as it turns out, is reckless. Most of the buffer overrun bugs that created terrible security vulnerabilities were due to code like this.

In my own practice, I observed that when I used ++ and --, my code tended to be too tight, too tricky, too cryptic. So, as a matter of discipline, I don't use them any more. I think that as a result, my coding style has become cleaner.

Allow that to sink in for a moment.

Nice way to destroy your own credibility, dude. I mean, there may be some terrible problem with JavaScript's ++ and --, but if there is, he's completely failed to tell us what it is. He's asserted that it's a bad thing in C, but this isn't a book about C. This is a book about JavaScript.

And, since I'm already shouting and red in the face, I'll add that what he says isn't true of C, either. It's not the increment of an index or pointer that's the problem: it's the lack of a bounds check before using the index or pointer. If you're not sure who to believe, Mr Published Expert or me, Random Internet Guy, ask yourself this simple question: would an incorrect C program with a buffer overrun be fixed by switching from "++" to "+= 1"? Would the buffer overrun have been less likely in any way, shape, or form if the programmer had used "+= 1" instead of "++"? Of course not.
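If you're not sure, run the experiment yourself: mechanically replacing "++" with "+= 1" produces identical behavior, bounds bugs and all.

```javascript
// Two loops differing only in "++" versus "+= 1". If "++" were the
// security menace, these would behave differently. They don't.
function sumWithIncrement(values) {
    var total = 0;
    for (var i = 0; i < values.length; ++i) {
        total += values[i];
    }
    return total;
}

function sumWithPlusEquals(values) {
    var total = 0;
    for (var i = 0; i < values.length; i += 1) {
        total += values[i];
    }
    return total;
}

var x = sumWithIncrement([1, 2, 3]);
var y = sumWithPlusEquals([1, 2, 3]);
```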

By this kind of bogus "logic", I shouldn't use the word "gift" in English, because "Gift" means "poison" in German, and, as you know, most of the poisonings that cause terrible illnesses in people are due to poison.

I haven't been subjected to such utter gobshite in a computer book since I gave up on Herbert Schildt.

If you've got a stomach for tripe, and you still think you might like this book, I'd recommend a glance at appendix E, which contains an implementation, presumably Crockford's, of a JSON parser. There's a "break" indented 9 levels and inside a loop made more readable by saying "i += 1" instead of "++i", of course; along with an assignment and a "delete" statement in another loop, both indented another 9 levels. I'll grant you it's a crude measure, but as far as style goes, deepest level of indentation is a pretty reliable indicator of a programmer's taste, and this man has absolutely none.

In fact, if you're seriously considering wasting money on this book, I strongly recommend you look at the last page of content (page 145), and the function on it. If you think the opinions of the man who wrote that code in a book primarily about style are worth your time, you need more help than any book can offer. There may be parts of the function that are, thanks to JavaScript's awfulness, unavoidably bad. But there are parts that could be trivially improved. (See the big comment for one hint; never say in comments what you can say in code!) And there are other seemingly obvious improvements that if they're not valid, it would really be worth explaining why not. For example: the author repeatedly tells us that JavaScript's for-each is broken and you need to make a hasOwnProperty test too, and the author has a huge hard-on for functional programming in all kinds of places where it doesn't help, but he won't write a higher-order forEach function?
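The helper in question would be maybe six lines. The name "eachOwnProperty" is mine, not the book's, but something like this buries the hasOwnProperty boilerplate in exactly one place:

```javascript
// A higher-order iterator that hides the hasOwnProperty test the
// book keeps telling us to write by hand. The name is hypothetical.
function eachOwnProperty(object, callback) {
    for (var name in object) {
        if (Object.prototype.hasOwnProperty.call(object, name)) {
            callback(name, object[name]);
        }
    }
}

var collected = {};
eachOwnProperty({ a: 1, b: 2 }, function (name, value) {
    collected[name] = value * 10;
});
```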


I still think the idea behind "JavaScript: The Good Parts" was a good one, but the implementation was flawed, and the author was perhaps not a man suited to the task. The language itself may or may not actually have enough good parts to leave a working language behind if you really did avoid the bad parts, but Douglas Crockford is not a man of sufficient perspicacity for us to find out.

If this book had been a blog post somewhere, I could have happily passed over it with little more than a "meh". But the thought that those bastards over at O'Reilly killed trees for this, and unknowing punters like you or me give them money for it... Unrelated to this, someone at work the other day claimed that O'Reilly hadn't printed a decent book in the last ten years, and there was general agreement. I asked that "Java Generics and Collections" be considered the sole exception, but it's bumf like this that's ruining O'Reilly's reputation.

So what did I learn about JavaScript? That I'm never going any closer to it than the sensible end of an AnythingButJavaScript-to-JavaScript compiler. And I shall do my best to avoid even that.


Farewell to Java 5?

You've probably read that Mac OS 10.6 has finally made Java 6 the default, and you may also have read that there are both 32-bit and 64-bit JVMs. (That latter part will only seem surprising to you if you'd been using Java 6 on 10.5.)

What I didn't know until I read Things Removed in Snow Leopard was that Java 1.4 and Java 5 have actually been removed in 10.6. (Cocoa-Java has been removed too, but we were told to stop using that years ago, and we know that what Apple deprecates in one version, Apple tends to remove in the next.)

If you've been paying attention to Ubuntu 9.10 – which I hadn't, but someone else pointed this out – a quick sun-java5-jre package search shows that while Ubuntu 9.04 ("jaunty") offered the sun-java5-jre package, 9.10 ("karmic") doesn't. Both have Java 6 though: compare the sun-java6-jre package search.

In the Ubuntu case, you'll still be able to install a Java 5 JDK/JRE downloaded direct from Sun, but in terms of ease (and likely default), it looks like we're finally entering a Java 6 world.

The people least likely to be happy about this are those running Java applications on a Unix box, displaying on a Cygwin X11 server. Sun's Java 6 and Cygwin's X11 server really don't get on well.

Mac OS 10.4 users might not be too happy as apps start requiring Java 6, since for them that means an OS upgrade: there are no practical third-party JVMs they can use. And for some, an OS upgrade will entail a hardware upgrade because 10.6 is the first version that doesn't support PowerPC. But now that these users are two major OS releases behind, Apple's "deprecate one, remove one" tempo means lots of stuff's going to start breaking for them anyway.

I'll be curious to see what happens to the Omni Software Update Statistics over the next couple of months. For 10.5 to have only overtaken 10.4 in 2009-02 is quite shocking for a group so traditionally upgrade-rabid as Mac users. (Though these numbers aren't necessarily representative of all Mac users.)