Desktop Linux suckage: Index

Blogger makes it a pain to read a series if you weren't subscribed to the feed as the posts were made, so here's an index/table of contents for use as a jumping-off point:

(Thanks to Eric Bloodworth for suggesting adding an index. I should have thought of it myself after reading Ian Lance Taylor's series on linkers. Hint: the easiest way to read that series is to increment the number in the URL.)


Desktop Linux suckage: where's our Steve Jobs?

[This is a long one, but it's the end of the series. Normal service will be resumed shortly.]

The article that prompted me to write these rants, "Why Free Software has poor usability, and how to improve it", failed to mention Linux's lack of a Steve Jobs. This is an important omission. The Linux kernel has its Steve Jobs. The question we have to ask ourselves is whether it's just a coincidence that the Linux kernel is the poster child for successful Free software.

What does the Linux desktop have? The Linux desktop has nothing. No person, not even a committee. Just the GNOME HIG.

I suggest you take a look at the GNOME HIG, in all its barely-changed-since-2001 glory. In particular, compare its breadth, depth, and quality to Apple's or even Microsoft's equivalents. You might think that the Apple and Microsoft guides go too far. Microsoft's is especially long. But you could hardly argue that developers of GNOME applications are doing such a good job that they don't need at least as much help as Mac and Windows developers. At least Apple and Microsoft seem to acknowledge the extent of the problem.

At least Apple and Microsoft seem to understand that developers need help: developers need better languages than C, developers need higher-level toolkits that do ever more of the stupid work that doesn't distinguish one application from another, and developers need documentation that helps them with the stuff that the toolkit can't. (Note the order in which I mention these things. There's significance to it.)

If you hang around Bugzilla enough, you'll start to notice "need direction from the HIG" as a polite way of saying "will not fix". You may as well rely on Santa Claus to buy your wife an anniversary gift as rely on the GNOME HIG being updated to reflect modern application trends.

Ideally, the HIG would be a strong document and there would be libraries that made it easier to write good applications than to write bad ones, and we'd be using a language well-suited both to that library, and to application development in general.

What we actually have is the GNOME HIG, GTK+, and C. Oops.

For the Linux desktop developer, everything is such hard work that it's not hard to see why no-one ever gets round to the all-important polish. Or why no-one has the agility to experiment until they get their UI right, rather than slapping together the first thing they think of, getting it half working, and being too scared to touch the tower of cards lest it topple before their eyes.

Do Mac programmers have the same problem? No. Quite the opposite. Cocoa does all the stupid crap for them, so instead of everyone writing their own code to save and restore window position (et cetera), programmers and users can just take that basic stuff for granted, and spend development time on the stuff relevant to their application. Their weakest link is their language, Objective-C, but since its major problem is just that you end up falling back to C when Cocoa doesn't already offer what you need, it's obviously still a net win compared to the usual GNOME situation. Objective-C also recently gained garbage collection. GNOME has a weak toolkit (GTK+) and a hopeless revenant language (C).

There are GTK+ bindings to other languages, but in my experience, they're fragile, incomplete, outdated, and/or unsupported. The three serious contenders for an alternative to C, namely C++, C#, and Java, are probably all too politically contentious to succeed without causing serious upset. (The amusing part being that most of the "C++ is too complicated", "C# is too evil/case-insensitive/Microsoft-encumbered", "Java is too slow" crap is likely to come from people wholly uninvolved in, and probably incapable of, writing software). I'm hoping OpenJDK might improve matters here, but you'll forgive me if I don't risk holding my breath.

I realize I've been ignoring KDE (based on Qt and C++), but it's just a somewhat better implementation of an orthogonally awful philosophy: "treat your users like technology fetishists who want to piss away their lives fiddling with myriad options in an effort to tease out a half-decent combination of settings", in contrast to GNOME's "treat your users like irrelevant cretins".

I do try KDE every few years, but it just brings out some kind of visceral disgust in me, without even the morbid curiosity that, say, a dentist's book of pictures of tooth decay and gum disease can bring out.

And I still refuse to believe that "offer a preference for everything anyone can come up with" is a responsible response to the design problem. I'd rather have a bunch of alternative applications each with their own coherent philosophy than a stinking pile of compromise. Though I have to wonder whether experience isn't trying to teach us that a stinking pile of compromise is all we can hope for from a Free desktop.

The fact that the situation is so much better for developers on other platforms is a reason to take issue with the article's title: "Why Free Software has poor usability, and how to improve it". When it says "Free software" it primarily means "Free software on Linux". Free software badly ported to the Mac sucks too, but that's just Linux exporting Linux's problems, not a problem with Free per se. Free software written by Mac developers for the Mac tends to suck a whole lot less than Free software written by Linux developers.

Did the developers of Adium (a Mac IM client) ever make a change so unpopular with their users, then have the cheek to tell those users they were wrong to want the feature, and so provoke a fork? No. But this actually happened on Linux: Slashdot: Pidgin Controversy Triggers Fork.

Vienna, a Free RSS reader for Mac OS, has a nice UI; admittedly not quite as good as its closed-source counterpart NetNewsWire, but still better than any Linux equivalent. TeXShop is a really nice front-end to TeX. Apple gives out Apple Design Awards to the Free (or at least OSS) Mac software, and not in any kind of sarcastic way. There exist Free applications for the Mac that are good in their own right.

The fact that all Mac developers are using the same high-quality toolkit directly influences Free Mac app quality, as does the better default choice of implementation language. So does the fact that Mac developers can't help but be exposed to large numbers of other apps with high-quality UIs, setting a good example ("a higher bar", in the parlance of our times).

Seriously, suppose you'd only ever used Linux. Just how high would your standards be?

Which isn't to say that all Free Linux apps suck, or all Free Mac apps rock. But it does help to explain why most Free Linux apps are way down the "suck" end.

The fact that I bring up RSS, and the awfulness of Linux desktop RSS readers, brings me to a final concern. Linux desktop applications are so bad that they push people towards web apps. Which is possibly good for the users, but it's not good for the Linux desktop. And, in a way, I don't think a world of only web apps would be good for web apps; today's web apps are as good as they are because they've been striving to compete with desktop apps on desktop apps' terms (irrespective of the location transparency, availability/backup/disaster recovery, and other "natural" advantages of web apps).

Driving everything onto the web would be one way to claim victory in terms of Linux desktop usability, of course. All you'd need is a browser, and Firefox is already pretty good.

I'd love to believe we can see things through, but I'm not even convinced we're moving in the right direction. Or any direction. Maybe we're just going round in circles. Certainly these arguments, the points made, the examples given, all sound very familiar.

As I said at the beginning of this series, I've had a Linux desktop, on and off, since about 1997. I've had a Unix desktop, in one form or another, since 1993. I even wrote my own window manager in my youth, when I too had more free time than sense. Assuming that Mosaic was an acceptable browser for the web as it was back in 1993, have we really made any progress on the (non-Mac) Unix desktop? As much progress as Apple or Microsoft has made? You could argue that Indigo Magic (SGI's desktop in 1993) or NeXTStep weren't as Free as what we have now, which is true. Or that both of those Unix desktops were way ahead of contemporary versions of Mac OS and Windows, which is also true (if you ignore availability of applications). But still, the main way in which I feel better off in 2008 than 1993 is that I have a bigger display with more colors. And that's not exactly thanks to better software.

Maybe some day, Mark Shuttleworth will turn up to work and say "I'm sick of all this pointless ugly Compiz crap and having to pretend in interviews that I sincerely think Compiz is in any way 'better' than or even remotely related to what Apple has, or even a sensible thing to pay any attention to. I understand now that transparency or wobbliness or shadows isn't what makes a good desktop: good apps make a good desktop, and from now on we're only going to stick apps on the Ubuntu menu if I can honestly stand in front of a non-fanboy audience and say 'this app works, and works well' and not be jeered off stage; in fact, I'm going to start kicking the worst ones off the menu unless they shape up, and we'll roll up our sleeves and fix the least worst, based on the rulings of a benevolent UI dictator".

And even supposing this happens, where would Shuttleworth find himself a suitable dictator? As the Pidgin developers inadvertently demonstrated, it's hard to find a benevolent dictator in a world of anarchic zealots. Maybe as hard as it is to find an honest politician in the real world.

And even assuming we had a way to be sure we were moving in a positive direction, our tools aren't the kind that help us move fast.

Honestly, I see no end in sight. Gruber's article will be as relevant in 2012, Thomas will write the third version of his article in 2014, and I'll still need a Mac as a crutch for my Linux box on the day I retire.


Desktop Linux suckage: 10 years

I often mention Joel Spolsky's article Good Software Takes Ten Years. Get Used To It. It's one of the best things he's ever written. One thing Joel doesn't promise in that article (or any other) is that the ten years it takes to create good software will be ten years of fun. He doesn't promise ten years of interesting code or ten years of great new features. Quite the opposite, in fact.

Most of your ten years will be spent on problems with Danish keyboards on Mac OS 10.3, or fonts on the Thai version of Windows 98, or some bug in some IMAP server that this one guy in Kabul sometimes has to connect to, or just making your error messages clearer, more specific, and more focused on helping the user fix their problem.

One of the big things that Joel explicitly states, because he has to, is that contrary to what we leave university thinking, these "hacks" are the most important bits of code we write. They aren't nasty bits of cruft that we should be marching out into the world to do away with, in order to make the world a better place. They're nasty bits of cruft that we're marching out into the world to add, in order to make the world a better place. They make stuff "just work", even though the hardware's broken. They make stuff "just work", even though the software they have to interact with is broken. They make stuff "just work" for most users because making 80% of your users happy is better than waiting for the perfect solution to come along and making no-one happy in the meantime.

The real world is full of lossage. The very phrase "real world" is practically defined by an admission of the existence of lossage. Industrial quantities of the stuff. But the academics mislead us, wanting to reshape us in their own image, and would have us believe that what the world needs is more beautiful theories and, maybe (and only if you can do it without getting your hands dirty) a partial implementation of the beautiful theory. They can only say this with a straight face because they are, at heart, ascetics. That's why they're in universities, for dog's sake. I'm not specifically talking about desktops right now, but their idea of a desktop is probably a single copy of vi running in a single XTerm, where they're editing some 10-line Haskell program with really clever puns in all the function names.

The presence of puns and absence of side-effects is what passes for quality in their world.

By the time we realize these people were lying, or deluded, or naive, or whatever, it's usually too late. We're too busy to give people what they want or need for free any more, so the Linux world has to make do with what our younger selves thought people ought to want. A hundred half-finished flying machines and not one working toilet.

Welcome to paradise. You might want to consider getting constipated.

There's a picture in Apple's HIG, in the section Prioritizing Design Decisions. I'll describe the picture in case the link breaks, or you just can't be bothered to follow it. The picture shows three blocks stacked atop each other. The largest block, on the bottom, is "Minimum Requirements". The next block up, somewhat smaller, is "Features users expect". The top, smallest, block is "Differentiation". The idea being that you start with the big block on the bottom, and work your way up.

Again, no genius. No "rocket science". Just common sense and good engineering. (If you actually read that chapter of Apple's HIG, you'll find that what they're actually talking about is very pedestrian stuff indeed. But at least it's an ethos.)

Problem: Linux desktop apps don't care about "Features users expect" or "Minimum requirements" at all. They may offer some "Differentiation", but not necessarily. Mostly, you just get "The bits that are fun to code".

I said Joel didn't promise the 10 years would be fun. There's something else he didn't promise. He didn't promise that 10 years would be sufficient. Necessary? Yes. Sufficient? Clearly not.


Desktop Linux suckage: a prime example

I've said before that, in my experience, filing bugs with Ubuntu seems more effective than filing them upstream, and a while back I filed a minor bug against GDebi. GDebi is a GUI .deb package installer that's pretty cool: it effectively lets you click on a link to a .deb in Firefox and install that program, all without touching the command line. The bug was '"Size" field in "Details" tab should show KiB units', and I included a patch, which was accepted and applied as quickly as any bug relating to the Linux desktop I've ever filed.

If you follow that link now, you can see that someone's just pointed out that, no, when apt-get(1) says "kB" it's not lying, it's completely serious. A quick apt-get install and check of /var/cache/apt/archives suggests they're right, at least as far as reports of download size are concerned. (It's harder to measure installed size, but I've no reason to doubt the claim.)

What I do doubt is the sanity of the person who made this design decision in the first place.

And while I accept that my patch was incorrect, and GDebi now claims file sizes are KiB rather than the kB they actually are, I also doubt the sanity of the person who actually thinks we should display file sizes in kilobytes rather than kibibytes. (Especially considering Nautilus, the GNOME "file manager", does actually use KiB, even if it calls them "KB". Especially considering the recent class-action lawsuits against the drive manufacturers. But I'm preaching to the choir here.)

Any time I try to kid myself that one day we'll have a non-crap Linux desktop, what makes me lose hope is that the Linux desktop isn't the kind of place where stupidity gets fixed or even papered over. There's no-one in charge to tell people to get their act together, and everyone has their own agenda, and half of them ought to be in a padded cell with no access to a keyboard.

At the moment, we have a situation where some applications are using kilobytes, some are using kibibytes, and what these applications appear to be claiming to use may or may not correspond to what they're actually using. And all this breakage surrounds something that's trivial in every way. The home computer operating system I was using in the late 1980s had a library call that you gave a number of bytes to, and it gave you back a string to show the user. Those old guys weren't geniuses, they were just good engineers.
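That kind of library call is trivial to sketch even today. Here's a hypothetical Python version (my own illustration, not any particular toolkit's API) that handles both conventions: decimal kilobytes as apt-get reports, and binary kibibytes as Nautilus displays:

```python
def format_size(n, si=False):
    """Return a human-readable string for n bytes.

    si=True uses decimal units (kB, 1000-based, apt-get's convention);
    si=False uses binary units (KiB, 1024-based, Nautilus's convention).
    """
    base = 1000 if si else 1024
    units = ["B", "kB", "MB", "GB", "TB"] if si else ["B", "KiB", "MiB", "GiB", "TiB"]
    value = float(n)
    for unit in units:
        # Stop once the value fits in this unit (or we've run out of units).
        if value < base or unit == units[-1]:
            return f"{int(value)} {unit}" if unit == "B" else f"{value:.1f} {unit}"
        value /= base

# The two conventions visibly disagree for the same byte count:
print(format_size(1048576))           # 1.0 MiB
print(format_size(1048576, si=True))  # 1.0 MB
```

Ten lines, one decision made once, and every application that uses it agrees with every other.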

The Linux desktop has neither geniuses nor plain good engineers where it needs them, and it's also lacking a benevolent dictator to at least herd the monkeys. Or whatever it is one does with monkeys, other than spank them.

It's not like we're talking about something as "hard" as globally respecting a date or time format preference. That's so far in the Linux desktop's future that at this rate, I might actually have retired before it happens.


Desktop Linux suckage: a quick exercise

One claim you'll sometimes hear is that, although Mac OS and Windows are better for stuff like word processing, spreadsheets, presentations, reading mail, drawing, painting, cataloging and touching up digital photographs, games, and all those other things that only real people do, Linux is best for hard-core nerdy shit.

Coincidentally, I just so happen to have a hard-core nerdy desire, one that, by this theory, Linux ought to handle way better than Mac OS. Yeah, it's an unscientific sample size, but it still might be instructive, and it's an actual question I want answered right now...

The problem
I want to know how many DIMM slots my motherboard has, and what DIMMs I have fitted.

Apple's solution
On the Mac, I click on the little Apple icon, I click "About this Mac...", I click "More Info..." and System Profiler starts, letting me see any detail about my system's hardware and software, including a convenient, detailed display of what DIMMs are in which slots. It even lets me save a handy .txt summary for inclusion in bug reports and the like.

Linux's solution
On Ubuntu, on the best-avoided crock that is the "System" menu, I have both "About Ubuntu" and "About GNOME". Neither contains a single piece of useful information. System Monitor, which I always have running because I like visual feedback telling me whether my computer's actually busy, waiting for I/O, or just ignoring me, has recently grown a perfunctory summary of my hardware that's not really useful yet. Maybe it'll improve, but I'd rather see a useful "About" menu item than rely on a panel applet that isn't there by default. If I go to someone else's system, I don't want to have to start installing stuff to help answer their simple hardware questions.

So I start a terminal and cat /proc/meminfo, but the information there is too high-level for my present purposes. I tab-complete on "ls", thinking there might be some equivalent of lspci(1), but the promising-looking lshw(1) is another dead-end, telling me only the total amount of memory fitted. Okay, screw guessing about on the command line. I know there is a command, because I've used it before, but I can't remember it, and I was supposed to be using the desktop anyway.

I start Firefox and search the web for "dimm equivalent of lspci" and a few variations before I stumble on "lspci ram" on Yahoo (I never did find anything relevant via Google, even with the search string that works with Yahoo). This gets me a handy Gentoo page (naturally) Detecting your Hardware which suggests hardinfo(1) which, once installed, gives me a GUI which is indeed pretty similar to the Mac's System Profiler. The memory information, though, is /proc/meminfo in a GTK+ text area. The authors seem to have been more interested in benchmarking than hardware inventory. I mutter something about Gentoo users, uninstall, and go back to searching.

The only other GUI I found was sysinfo(1) which was almost like someone's joke entry in a "worst ever Linux software" competition. When it starts, the UI is animated, but doesn't actually show anything that has any meaning. There's a preference dialog, mostly taken up with a preference to turn off the pointless start-up animation. (And don't get too excited by my use of the word "animation". I use it in the loosest possible sense. Think of Windows XP's lame bouncing arrow pointing to the "Start" button.)

sysinfo's status bar constantly shows the date (in some random localized time format that I certainly never asked for). If you expectantly click on "Memory" in the list of options (decorated for some obscure reason with the icon usually reserved for copy-to-clipboard menu items), you see two sliders showing the free/used fraction of "Memory" and "Swap". There's a hopeful-looking disclosure triangle labeled "More details", but that just shows "Cached", "Active", and "Inactive", which I know by now are just the next three lines of /proc/meminfo. Looking to see who was responsible for this travesty, I noticed that the Close button in the about box doesn't work, which I thought was a particularly nice finishing touch to the encounter.

At this point, I resigned myself to either not knowing what DIMMs I have fitted, or opening the case to take a look.

I don't present this anecdote with any particular interpretation in mind, nor because the specific example is particularly representative, but it should at least serve as a warning to anyone foolish enough to think that "at least Linux is okay for the nerdy stuff". No, it's really crap there too. We have too many half-baked attempts at writing the same few things, and almost nothing that's followed through to a useful "shippable" point.

We're stuck with our 1980s XTerms and our 1970s Emacs/Vim (or our IDEs written by people paid to work on them by commercial vendors) and our compiler, which alongside the Linux kernel is one of our few real successes. Like the Linux kernel, though, it has many commercial vendors paying people to work on it full-time. We kid ourselves if we think that these are the fruits of the "community" of bedroom hackers. Those are the people churning out the half-baked "utilities" that had more time spent on their about box and their start-up animation than on their usefulness or usability.

On the internet, no-one knows you're a 14-year-old or a raccoon. Until they use your software.

David Bristow reminded me of dmidecode(1) which was the command-line program whose name I couldn't remember that does provide the relevant information though (as David admits) in somewhat hard-core form. Harald Koch explained how to get lshw(1) to do the right thing: "sudo lshw". He also suggests "sudo lshw -short" which provides a much more convenient overview style of output than I'd seen before. To be fair, when not run as root, lshw(1) does say "WARNING: you should run this program as super-user". But it makes the common command-line mistakes of (a) outputting this first rather than last, (b) continuing anyway rather than exiting or insisting on some kind of --partial option before doing the wrong thing, and (c) going on to produce so much output that the warning scrolls off the top of the display or is lost anyway. Sadly, no-one cares enough about command-line usability these days to teach the lost art.
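For what it's worth, dmidecode(1)'s output is regular enough that taking the hard-core edge off it yourself is easy. Here's a sketch in Python; the "Memory Device", "Size", and "Locator" field names match typical dmidecode output, but the exact layout varies by BIOS, so treat this as illustrative rather than robust:

```python
import re

def dimm_summary(dmi_output):
    """Parse `dmidecode --type memory` output into (locator, size) pairs.

    Empty slots typically report a Size of "No Module Installed".
    """
    dimms = []
    # Each DIMM slot appears as a "Memory Device" record.
    for device in dmi_output.split("Memory Device")[1:]:
        size = re.search(r"^\s*Size:\s*(.+)$", device, re.M)
        # "Bank Locator" lines can't match here: ^\s* won't skip the word "Bank".
        locator = re.search(r"^\s*Locator:\s*(.+)$", device, re.M)
        if size and locator:
            dimms.append((locator.group(1), size.group(1)))
    return dimms

# Like lshw, dmidecode needs root, so (hypothetically):
#   import subprocess
#   out = subprocess.check_output(["dmidecode", "--type", "memory"], text=True)
#   print(dimm_summary(out))
```

Which is exactly the kind of ten-minute wrapper that the Mac's System Profiler means no Mac user ever has to write.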

Playing more with lshw(1), it turns out to have a companion GUI called lshw-gtk(1), but the interface is truly awful. You know NeXT's awkward column view interface style that's been slowly dying out in Mac OS because it just doesn't work well? lshw-gtk(1) goes to some effort to recreate that, complete with useless top levels and a requirement that you double-click rather than single-click to see a node's children. The Debian package also causes it to hide itself in System > Preferences (surely Administration would make more sense?). Despite these problems, it's still the least worst GUI solution I've seen. The individual panes are really nicely presented if you're persistent enough to navigate all the way down to them. And the "you really need to run this as root" warning is a lot stronger and does actually make you click a button before it'll do the wrong thing. But honestly, my recommendation would be "sudo lshw -short" in the terminal emulator of your choice.