Desktop Linux suckage: introduction

Windows 2000
I remember around the year 2000, or whenever it was that Windows 2000 actually came out, I thought Microsoft were doing pretty well. Much as it pained me to say it, I also thought Windows was the best OS available. At the time, Windows was without question the best place to run Sun's JVM, Netscape was running aground, and IE was picking up steam, on its way to becoming the best web browser of its day.

Microsoft really looked like they were getting their act together. Each release of their OS had been convincingly better than the previous one, and Windows had reached a point where not only could I bear to use it, I actually quite liked using it. Outlook Express was fine for my mailing needs. Windows Media Player was fine for my mp3 needs (I had about five mp3s in those days, and as I recall, mainly used my MiniDisc player, a technology you youngsters probably don't even remember). I used my own editor (though it wasn't called Evergreen in those days), I used Sun's JVM, I used the application I was working on, IE was a nice, fast web browser, and thus all my computing needs were covered.

I have always been a man of modest desktop application needs. Right now, I see Firefox, Evergreen, and Terminator. Rhythmbox is lurking, System Monitor is running, and that's it. Oh, there's a 24-hour digital clock in the corner.

But back to the year 2000. I'd been using Linux and Solaris on my desktop for the previous five years. Out of love, rather than because either of them was necessarily a good choice, let alone the best choice. Solaris was slow and, in practice, required over-expensive and under-performing hardware. Linux ran faster (even on the same hardware) but was often unreliable, and didn't have a decent JVM, which meant I couldn't actually get much work done.

As a user, too, the situation on Unix sucked. There was no software to run, decent or otherwise, and even the web browser, an application category invented on Unix, was falling too far behind the actual web. Netscape Navigator was bloating up with useless and ill-conceived crap, rather than actually improving in ways relevant to meeting IE's challenge.

So somewhat reluctantly, when a dual-processor Compaq machine (I told you this was a long time ago) running Windows 2000 appeared, I voluntarily became a Windows user. I was instantly happier and more productive than I'd been for years. I convinced myself that Microsoft had turned a corner, and still think of those days as Microsoft's "golden age", but in retrospect, it looks more like it was a temporary blip caused by the combination of Microsoft's successes and their competition's failures.

The stench of commercial Unix rotting from the inside out had just started to reach my nostrils, Linux still wasn't ready unless you had more free time than sense, and Mac OS was still the crappy 16-bit home computer OS from the 1980s rather than the "Unix for the rest of us" it is today.

What happened next
We all know what happened next. Apple reinvented Mac OS, and the web turned into a dangerous place Windows just wasn't ready for. Commercial Unix died, with the arguable exception of the server room. Linux? Well, Linux just kind of muddled along. It didn't get much better, but it didn't get much worse. It was fine for development work, but a pretty lame desktop. Many Unix lovers (and haters, because it's quite hard to be one without also being the other) settled on a Linux box for development work and a Mac for everything else.

That sums up how I saw the pragmatic developer/user's best choices in 2004, which is when John Gruber wrote Ronco Spray-On Usability, an article recently indirectly slashdotted via Why Free Software has poor usability, and how to improve it.

If you only read one of those articles, read Gruber's. It's the best-written and it's especially interesting because despite being written in 2004, it could have been written yesterday. It's also worth reading his follow-on article from back then, which turns out to say much of what I'm about to say.

Sometimes, you're done
If you've read my reviews of the last few versions of Mac OS, you may have noticed some disenchantment. Not that Mac OS is actively getting worse; it's just no longer getting much better, for me. And the few minor ways in which it's getting worse manage to cancel out the improvements to the extent that, for example, I've stuck with 10.4 on my MacBook Pro at work, despite (or because of) the fact that I've used 10.5 at home. The truth is, 10.3 was pretty much good enough. Good enough to make it hard to add US$120's worth of value.

If you're not familiar with Mac OS, but are familiar with Microsoft Office, you've probably known a similar feeling with Office "upgrades". The few people I know who've claimed to like a version of Word, for example, tend to like the oldest version they came into contact with. Everything since then has seemed superfluous to them; more trouble than it's worth. We know many developers/managers fail to realize that good software takes Joel's ten years, but I think even more have trouble accepting that there does eventually come a day when the software is "done", and most changes you can make from that point will make the product worse for as many people as they make it better.

The arrival of that day presents a major problem for a commercial entity; it's seriously non-trivial to come up with a significantly better way of doing something. Since check-as-you-type spelling checking, for example, what's really made a significant difference to word processing? Word processing is "done" (for now), and Office 2008 is living proof that Microsoft can't afford to accept that. Mac OS, seemingly, is "done" (for now, on the desktop), and Mac OS 10.5 is living proof that Apple can't afford to accept that.

[My dad just called to tell me I'm ignoring Time Machine, which is his new favorite thing. Maybe he's right, maybe he's not. The fact that Mac OS (or Word) is good enough that we can even seriously have this argument is telling. We could hardly be having the same argument about desktop Linux, could we?]

Amusingly, though "done day" would be a time to rejoice rather than worry for Free developers, there's no real sign we're getting anywhere close to that day, despite optimistic pronouncements to the contrary. Sure, Linux has got better since 2004. As a developer, I no longer want to use anything else. OpenJDK even has me actively excited about further improvements in the near future. Life on Linux is sweet for me, as a developer. But as a user?

Ubuntu 8.04
Non-developer progress has been slow, and patchy, and sometimes seems to go backwards. I'm starting to wonder if somewhere between Ubuntu 6.06 and Ubuntu 7.10, we didn't see Linux's "Windows 2000". Ubuntu 8.04 is a drag. So much stuff that used to work no longer works that I may as well be using 64-bit Vista (or whatever it is that's supposedly causing Windows users pain at the moment). I hear people arguing at work about whether Flash works or not. Audio. DVD playing. CD burning. Sleep/hibernation. Apple keyboards.

I've had problems with all of the above, minus audio. Despite audio being one of those in-jokes amongst Linux users, I've never actually had any trouble with it. (Though come to think of it, I've only ever even bothered to try on one of the ten or so Linux machines I've used in the past decade, so maybe that doesn't mean anything.) All of these things (minus Flash) worked in 7.10. All (minus Flash and sleep/hibernation) worked in 7.04. I had high hopes for 8.04. This was obviously going to be the one. A "long-term support" release (where "long-term" means 3 years) coming 6 months after a really strong release. What could possibly go wrong?

I upgraded, and a couple of months later, I stopped using my home Mac. Time, I thought, to make Linux my OS of choice not just for development.

It's interesting to reflect, at this point, upon the fact that I eagerly upgrade my Linux box every six months. Because the new stuff is so great? No. Because the old stuff is so bad. There's always something causing me enough pain that I want to upgrade to make it go away.

And that would be fine, if upgrading didn't always bring new pain. In the next few posts, I'll talk about how it felt to finally switch to Linux for all my desktop needs.