Language-induced brain damage is better than the alternative

Whenever old uncle Edsger had had a skinful, he'd rant about the harmful effects of bad languages on programmers. As a kid, I was never particularly convinced. He was too old to be all that concerned about C, and Ada hadn't even been invented yet, let alone Java, so his rants about Fortran, PL/I, Cobol and APL all seemed comically anachronistic. The languages he'd talk about were all pretty moribund anyway, at least for the purposes of a software engineer (as opposed to, say, a physicist).

BASIC, his other bête noire, never really seemed that bad. I grew up with a pretty good BASIC, the main deficiency of which seemed to be the lack of a garbage-collected heap (except for strings, which were special). Even as a kid, I could see that fixed-size arrays were evil, so it was distressing that one's language forced one into the habit. But to me, that disproved Ed's claim: I was clearly a child of the BASIC era, and yet wasn't I sophisticated enough to recognize the problems? Wasn't this ability to recognize the flaws of one's language almost a test of one's latent aptitude, and thus useful in distinguishing those with real potential from those without?

In the years since, I've used a lot of languages. It's hard to imagine a well-rounded programmer who hasn't been exposed to an assembly language, a Lisp, Haskell or an ML, SQL, C, C++, and a Java-like managed language. And it's probably a shame too few encounter a Smalltalk. Even if you don't like these languages, and even if you wouldn't use them commercially, I think they each influence the way we think about computation and programming. And I think that influence makes us better at doing our jobs, regardless of which language we're actually using.

(I deliberately omitted logic programming languages -- both deductive and the even less common inductive -- because if they did have an effect on me or my thinking, I've absolutely no idea what it was, and if they didn't I've absolutely no idea what I've missed.)

So it seems to me like there's a trade-off. Yes, learning a new class of language will change the way you think, but the change will be both for better and for worse. I don't think you can avoid this, and I think that deliberately remaining ignorant is worse than just accepting the mental scarring as a fact of life. Hell, I even think that learning absolutely appalling languages like Ada, S, and Javascript is an important experience. Those who cannot remember the past are condemned to repeat it.

But what I think is really interesting, and another reason it was hard to believe Ed's claim, is that pretty much by definition you can't see the damage a language does to you as clearly as you can see the good. You're likely to remember that language X taught you Y, but you don't even know that it failed to expose you to Z. So back in my BASIC days, I never bemoaned the lack of a sequence type or a map type. I almost missed the former, but would have been over-specific in my demands: I wanted to dynamically size arrays. What I thought I wanted was something like C's realloc(3), not C++'s std::vector. It wasn't until I was a C programmer and had realloc(3) that I realized how small an advance that is, and it wasn't until I was a C++ programmer that I realized that, really, I wanted a managed heap. (Not always, of course, because someone has to implement the managed language's runtime, but another thing that learning plenty of languages teaches you is the importance of always using the highest-level one you can afford for any given task.)
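(To make that progression concrete, here's a minimal sketch -- in Java, purely as illustration, not anything from the languages above: Arrays.copyOf is more or less realloc(3) transliterated, while ArrayList is the growable sequence type I actually wanted all along.)

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Growable {
    public static void main(String[] args) {
        // The realloc(3)-style "advance": you can grow an array, but
        // only by explicitly copying it, and anyone still holding the
        // old reference is left pointing at the stale storage.
        int[] xs = new int[4];
        xs = Arrays.copyOf(xs, xs.length * 2);

        // The managed-heap answer: the container owns its storage,
        // and resizing is no longer the caller's problem at all.
        List<Integer> ys = new ArrayList<>();
        for (int i = 0; i < 100; ++i) ys.add(i);
        System.out.println(xs.length + " " + ys.size());
    }
}
```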

I was reminded of this topic recently when someone sent round a link to a Javascript x86 emulator. The interesting part to me was Javascript Typed Arrays. Javascript is very definitely in the class of languages that I'd never voluntarily use, but that doesn't mean I'm not interested to see what they're up to. And, as maintainer of an implementation of java.nio buffers, I was interested to see the equivalent functionality that Javascript users are getting.

If you don't know java.nio buffers, they're one of Java's many ill-conceived APIs. I say this as a fan of managed languages in general, and a long-time Java user, but having both used and implemented java.nio buffers, there's very little love lost between me and them. They're almost exactly not what I would have done. Surprisingly to me, given my admitted dislike of Javascript, Javascript's typed arrays are pretty much exactly what I would have done.

If I were asked to point to the most damaging design error in java.nio buffers, it would be one that I think was a side-effect of the kind of brain damage that C causes. Specifically, I contend that C programmers don't usually have a clear mental distinction between containers and iterators. I think that was one of the things that C++'s STL really taught us: that containers and iterators (and algorithms) are distinct, and that it's important to maintain these distinctions to get a high-quality library. The design of ICU4C suffers greatly from an ignorance of this idea (ICU4C is the C/C++ equivalent of the heinous java.text classes and such all-time API war crimes as java.util.Calendar, brought to you by the same people).
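The distinction is easy to show in code. Here's a minimal Java sketch (the algorithm and names are mine, purely for illustration): an "algorithm" written against Iterator alone knows nothing about any particular container, which is exactly the separation the STL teaches.

```java
import java.util.Iterator;
import java.util.List;
import java.util.Set;
import java.util.function.Predicate;

public class Algorithms {
    // An STL-style "algorithm": it knows only about iterators, so it
    // works over a List, a Set, or anything else that can hand one out.
    static <T> int count(Iterator<T> it, Predicate<T> p) {
        int n = 0;
        while (it.hasNext()) {
            if (p.test(it.next())) ++n;
        }
        return n;
    }

    public static void main(String[] args) {
        List<String> list = List.of("a", "bb", "ccc");
        Set<String> set = Set.of("dd", "e");
        // Same algorithm, two different containers, no shared state.
        System.out.println(count(list.iterator(), s -> s.length() > 1)); // 2
        System.out.println(count(set.iterator(), s -> s.length() > 1));  // 1
    }
}
```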

Java programmers ought not to be ignorant of this important lesson, but it took two attempts to get half-decent collections in the library (three if you count the late addition of generics), and iteration has been such a dog's breakfast in Java that I don't think the lesson to students of Java is nearly as clear as it is to students of C++.

(Dog's breakfast? Enumeration versus Iterator versus int indexes; raw arrays versus collections; the awful and verbose Iterator interface; and most of all the modern Iterable turd which makes the "enhanced" for loop less generally useful than it should have been and encourages the confusion between collections and iterators because the modern style involves an anonymous and invisible iterator. From arguments I've had with them, I think those responsible were hampered by the brain damage inflicted C and their ignorance of C++, an ignorance of which they're bizarrely boastful.)
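If you haven't had to live with it, here's a sketch of those coexisting styles side by side; Vector is used only because it's old enough to show off Enumeration:

```java
import java.util.Enumeration;
import java.util.Iterator;
import java.util.List;
import java.util.Vector;

public class DogsBreakfast {
    public static void main(String[] args) {
        Vector<String> v = new Vector<>(List.of("a", "b", "c"));

        // Style 1: the legacy Enumeration, still lurking in old APIs.
        for (Enumeration<String> e = v.elements(); e.hasMoreElements(); ) {
            System.out.println(e.nextElement());
        }
        // Style 2: the verbose Iterator interface.
        for (Iterator<String> it = v.iterator(); it.hasNext(); ) {
            System.out.println(it.next());
        }
        // Style 3: raw int indexes, as if the collection were an array.
        for (int i = 0; i < v.size(); ++i) {
            System.out.println(v.get(i));
        }
        // Style 4: the "enhanced" for loop, whose anonymous, invisible
        // Iterator is exactly the confusion complained about above.
        for (String s : v) {
            System.out.println(s);
        }
    }
}
```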

But java.nio buffers are far, far worse. There, rather than offering any kind of iterators, the collections themselves (that is, the buffers) have an implicit position. (Buffers have other state that really belongs in an iterator, and that is inconsistently inherited by related buffers, but that's beyond the scope of this discussion.) You can simulate iterators by making new buffers (with Buffer.duplicate, say), but it's awkward and ass-backward, leading to ugly, intention-obscuring calling code and to another generation of programmers with this particular kind of brain damage.
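Here's a small sketch of what I mean -- real java.nio calls, made-up data: the buffer is its own cursor, and duplicating the whole buffer is the closest thing to "give me an iterator" on offer.

```java
import java.nio.ByteBuffer;

public class BufferCursor {
    public static void main(String[] args) {
        ByteBuffer buffer = ByteBuffer.wrap(new byte[] {1, 2, 3, 4});

        // The collection is its own cursor: every relative get()
        // mutates the buffer's position.
        buffer.get();
        System.out.println(buffer.position()); // 1, not 0

        // Simulating an iterator means duplicating the whole buffer;
        // the duplicate shares the underlying storage but gets its own
        // position, limit, and mark (starting from the original's).
        ByteBuffer it = buffer.duplicate();
        while (it.hasRemaining()) {
            System.out.println(it.get()); // 2, 3, 4
        }
        // Nothing in the types distinguishes the "container" from the
        // "iterator": both are just ByteBuffers.
        System.out.println(buffer.position()); // still 1
    }
}
```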

(At this point you might argue that the ideal model is one of collections and ranges rather than iterators, since C++ iterators do tend to come in pairs, and from there you might argue that a range is really just another way of expressing a view, and from there that a view is best expressed as a collection, and from there that the containers-are-iterators model I'm complaining about actually makes sense. It's one of those "how did we get into Vietnam"-style arguments, where any individual step isn't entirely unreasonable in itself, but where the final result is utterly fucked. The problem here being not so much a land war in Asia but having all collections have an implicit position to support iterator-style usage. Which in practice means that you've got a new orthogonal axis of "constness" to worry about, and that it's a lot harder to share containers. It's actively working against what I think most people consider to be one of the big lessons Java taught us: design for immutability. In a functional language, always working with views and relying on referential transparency might be fine, but Java is not that language, and many of the mistakes in the collections API are, I think, down to trying to pretend that it is. Which I hope makes it clear that I'm not blaming C any more than I'm blaming Haskell: I'm just giving examples of mistakes caused by transferring concepts into situations where they no longer make sense.)
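The sharing problem, in another sketch (again real API, made-up data): any function that reads a buffer consumes it for everyone else, and even a "read-only" view only freezes the contents, not the cursor.

```java
import java.nio.ByteBuffer;

public class SharingHazard {
    // Looks like a harmless read-only operation, but every relative
    // get() advances the shared position.
    static int sum(ByteBuffer b) {
        int total = 0;
        while (b.hasRemaining()) total += b.get();
        return total;
    }

    public static void main(String[] args) {
        ByteBuffer shared = ByteBuffer.wrap(new byte[] {1, 2, 3, 4});
        System.out.println(sum(shared)); // 10
        System.out.println(sum(shared)); // 0: the first call ate the data

        // The extra axis of "constness": a read-only view freezes the
        // contents, but its position and limit are still mutable.
        ByteBuffer ro = shared.asReadOnlyBuffer();
        ro.position(2); // perfectly legal on a "read-only" buffer
    }
}
```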

The Javascript DataView class is sorely missing from java.nio's mess too. It's a really common use case that's very poorly served by java.nio. My java.nio implementation has something similar internally, but it's really nice to see the Javascript guys exposing what appears to be a lean, sane, task-focused API.
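To show what I mean by the use case, here's roughly how you'd pick apart a hypothetical little-endian 8-byte header with java.nio today -- absolute gets do the job, but the byte order is buffer-wide state rather than DataView's per-read argument:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class HeaderReader {
    public static void main(String[] args) {
        // A made-up header layout: 2-byte magic, 2-byte version, and
        // a 4-byte length, all little-endian.
        byte[] raw = {0x4d, 0x5a, 0x02, 0x00, 0x10, 0x00, 0x00, 0x00};

        ByteBuffer b = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        short magic = b.getShort(0);   // absolute reads, at least,
        short version = b.getShort(2); // leave the implicit position alone
        int length = b.getInt(4);
        System.out.printf("magic=%04x version=%d length=%d%n",
                          magic, version, length);
    }
}
```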

I do think there's a really nice comparative programming languages book to be written, and I think one of the most interesting chapters would be the one about iteration styles. I don't know whether it's surprising that something so fundamental should differ so wildly between languages (I didn't even touch on the Smalltalk style, which is something completely different again from any of the styles I did touch on), or whether it's exactly in the fundamentals that you'd expect to find the greatest differences.
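(For a taste of how different: Smalltalk's do: style is internal iteration -- the collection runs the loop and you hand it a block -- which you can approximate in modern Java with forEach:)

```java
import java.util.List;

public class InternalIteration {
    public static void main(String[] args) {
        List<String> words = List.of("alpha", "beta", "gamma");
        // Smalltalk-style internal iteration: the collection drives
        // the loop; the caller supplies only the block to run.
        words.forEach(w -> System.out.println(w.toUpperCase()));
    }
}
```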

If this is the brain damage uncle Ed was so keen to warn me about, all I can say is "bring it on!". As far as I can tell, more different kinds of brain damage seem to lead to better outcomes than staying safe at home with the first thing that ever hit you over the head.