The article that prompted me to write these rants, "Why Free Software has poor usability, and how to improve it", failed to mention Linux's lack of a Steve Jobs. This is an important omission. The Linux kernel has its Steve Jobs. The question we have to ask ourselves is whether it's just a coincidence that the Linux kernel is the poster child for successful Free software.
What does the Linux desktop have? The Linux desktop has nothing. No person, not even a committee. Just the GNOME HIG.
I suggest you take a look at the GNOME HIG, in all its barely-changed-since-2001 glory. In particular, compare its breadth, depth, and quality to Apple's or even Microsoft's equivalents. You might think that the Apple and Microsoft guides go too far. Microsoft's in particular is long. But you could hardly argue that developers of GNOME applications are doing such a good job they don't need at least as much help as Mac and Windows developers. At least Apple and Microsoft seem to acknowledge the extent of the problem.
At least Apple and Microsoft seem to understand that developers need help: developers need better languages than C, developers need higher-level toolkits that do ever more of the stupid work that doesn't distinguish one application from another, and developers need documentation that helps them with the stuff that the toolkit can't. (Note the order in which I mention these things. There's significance to it.)
If you hang around Bugzilla enough, you'll start to notice "need direction from the HIG" as a polite way of saying "will not fix". You may as well rely on Santa Claus to buy your wife an anniversary gift as rely on the GNOME HIG being updated to reflect modern application trends.
Ideally, the HIG would be a strong document and there would be libraries that made it easier to write good applications than to write bad ones, and we'd be using a language well-suited both to that library, and to application development in general.
What we actually have is the GNOME HIG, GTK+, and C. Oops.
For the Linux desktop developer, everything is such hard work that it's not hard to see why no-one ever gets round to the all-important polish. Or why no-one has the agility to experiment until they get their UI right, rather than slapping together the first thing they think of, getting it half working, and being too scared to touch the house of cards lest it topple before their eyes.
Do Mac programmers have the same problem? No. Quite the opposite. Cocoa does all the stupid crap for them, so instead of everyone writing their own code to save and restore window position (et cetera), programmers and users can take that basic stuff for granted, and development time goes on the stuff relevant to each application. Their weakest link is their language, Objective-C, but since its main failing is merely that you fall back to C when Cocoa doesn't already offer what you need, it's still a net win compared to the usual GNOME situation. Objective-C also recently gained garbage collection. GNOME has a weak toolkit (GTK+) and a hopeless revenant language (C).
There are GTK+ bindings to other languages, but in my experience, they're fragile, incomplete, outdated, and/or unsupported. The three serious contenders for an alternative to C, namely C++, C#, and Java, are probably all too politically contentious to succeed without causing serious upset. (The amusing part being that most of the "C++ is too complicated", "C# is too evil/case-insensitive/Microsoft-encumbered", "Java is too slow" crap is likely to come from people wholly uninvolved in, and probably incapable of, writing software). I'm hoping OpenJDK might improve matters here, but you'll forgive me if I don't risk holding my breath.
I realize I've been ignoring KDE (based on Qt and C++), but they're just a somewhat better implementation of an orthogonally awful philosophy. "Treat your users like technology fetishists who want to piss away their lives fiddling with myriad options in an effort to tease out a half-decent combination of settings" in contrast to GNOME's "treat your users like irrelevant cretins".
I do try KDE every few years, but it just brings out some kind of visceral disgust in me, without even the morbid curiosity that, say, a dentist's book of pictures of tooth decay and gum disease can bring out.
And I still refuse to believe that "offer a preference for everything anyone can come up with" is a responsible response to the design problem. I'd rather have a bunch of alternative applications each with their own coherent philosophy than a stinking pile of compromise. Though I have to wonder whether experience isn't trying to teach us that a stinking pile of compromise is all we can hope for from a Free desktop.
The fact that the situation is so much better for developers on other platforms is a reason to take issue with the article's title: "Why Free Software has poor usability, and how to improve it". When it says "Free software" it primarily means "Free software on Linux". Free software badly ported to the Mac sucks too, but that's just Linux exporting Linux's problems, not a problem with Free per se. Free software written by Mac developers for the Mac tends to suck a whole lot less than Free software written by Linux developers.
Did the developers of Adium (a Mac IM client) ever make a change so unpopular with their users, and have the cheek to tell them they were wrong to want the feature, that the project forked? No. But this actually happened on Linux: Slashdot: Pidgin Controversy Triggers Fork.
Vienna, a Free RSS reader for Mac OS, has a nice UI; admittedly not quite as good as its closed-source counterpart NetNewsWire, but still better than any Linux equivalent. TeXShop is a really nice front-end to TeX. Apple gives out Apple Design Awards to the Free (or at least OSS) Mac software, and not in any kind of sarcastic way. There exist Free applications for the Mac that are good in their own right.
The fact that all Mac developers are using the same high-quality toolkit directly influences Free Mac app quality, as does the better default choice of application implementation language. The fact that Mac developers can't help but be exposed to large numbers of other apps with high-quality UIs, setting a good example ("a higher bar", in the parlance of our times) has a direct influence on Free Mac app quality.
Seriously, suppose you'd only ever used Linux. Just how high would your standards be?
Which isn't to say that all Free Linux apps suck, or all Free Mac apps rock. But it does help to explain why most Free Linux apps are way down the "suck" end.
The fact that I bring up RSS, and the awfulness of Linux desktop RSS readers, brings me to a final concern. Linux desktop applications are so bad that they push people towards web apps. Which is possibly good for the users, but it's not good for the Linux desktop. And, in a way, I don't think a world of only web apps would be good for web apps; today's web apps are as good as they are because they've been striving to compete with desktop apps on desktop apps' terms (irrespective of the location transparency, availability/backup/disaster recovery, and other "natural" advantages of web apps).
Driving everything onto the web would be one way to claim victory in terms of Linux desktop usability, of course. All you'd need is a browser, and Firefox is already pretty good.
I'd love to believe we can see things through, but I'm not even convinced we're moving in the right direction. Or any direction. Maybe we're just going round in circles. Certainly these arguments, the points made, the examples given, all sound very familiar.
As I said at the beginning of this series, I've had a Linux desktop, on and off, since about 1997. I've had a Unix desktop, in one form or another, since 1993. I even wrote my own window manager in my youth, when I too had more free time than sense. Assuming that Mosaic was an acceptable browser for the web as it was back in 1993, have we really made any progress on the (non-Mac) Unix desktop? As much progress as Apple or Microsoft has made? You could argue that Indigo Magic (SGI's desktop in 1993) or NeXTStep weren't as Free as what we have now, which is true. Or that both of those Unix desktops were way ahead of contemporary versions of Mac OS and Windows, which is also true (if you ignore availability of applications). But still, the main way in which I feel better off in 2008 than 1993 is that I have a bigger display with more colors. And that's not exactly thanks to better software.
Maybe some day, Mark Shuttleworth will turn up to work and say "I'm sick of all this pointless ugly Compiz crap and having to pretend in interviews that I sincerely think Compiz is in any way 'better' than or even remotely related to what Apple has, or even a sensible thing to pay any attention to. I understand now that transparency or wobbliness or shadows isn't what makes a good desktop: good apps make a good desktop, and from now on we're only going to stick apps on the Ubuntu menu if I can honestly stand in front of a non-fanboy audience and say 'this app works, and works well' and not be jeered off stage; in fact, I'm going to start kicking the worst ones off the menu unless they shape up, and we'll roll up our sleeves and fix the least worst, based on the rulings of a benevolent UI dictator".
And even supposing this happens, where would Shuttleworth find himself a suitable dictator? As the Pidgin developers inadvertently demonstrated, it's hard to find a benevolent dictator in a world of anarchic zealots. Maybe as hard as it is to find an honest politician in the real world.
And even assuming we had a way to be sure we were moving in a positive direction, our tools aren't the kind that help us move fast.
Honestly, I see no end in sight. Gruber's article will be as relevant in 2012, Thomas will write the third version of his article in 2014, and I'll still need a Mac as a crutch for my Linux box on the day I retire.