Desktop Linux suckage: Rhythmbox

Less important than a web browser, but probably next on the list of mainstream fundamentals is a music player. It's the only other thing I run daily that I haven't already mentioned. Even handy stuff like Wireshark (which gets an exemption from criticism in this series because it's by nerds for nerds) only gets run once a week.

I've said in the past that Rhythmbox is okay, but that was before I was using it as my only mp3 player, rather than as an occasional stand-in. In practice, well, it has a large number of small but annoying flaws. I'm not going to mention them all here. Just a representative sample.

Let's start near the top of the UI: Rhythmbox uses a slider to show how far through the current track you are. This has the annoying consequence that your clicks are interpreted as meaning "move one 'page' in the direction of the mouse", rather than "move to the time corresponding to the location of the mouse". This might make some sense if clicking to the left of the knob meant "back 10s" and to the right meant "forward 10s", but because the time skipped is proportional to the length of the track, it's completely useless.
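The behavior I want is simple arithmetic: map the click's horizontal position to an absolute time in the track. A sketch (the function name and the pixel/second values are just illustrative, not Rhythmbox's actual code):

```python
def seek_time(click_x, slider_width, track_duration):
    """Map a click position on the slider to an absolute time in the track."""
    # Clamp so clicks slightly outside the slider still do something sane.
    fraction = max(0.0, min(1.0, click_x / slider_width))
    return fraction * track_duration

# Clicking three quarters of the way along a 200px slider during a
# 240-second track should seek to the 180-second mark.
print(seek_time(150, 200, 240))  # → 180.0
```

That's the entire feature; there's no page size or track length to reason about, which is exactly why it's the convention everywhere else.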

I could ignore the fact that a standard slider looks wrong, but it's hard to ignore that it feels wrong. Only, this being the Linux desktop, it gets worse. For some reason, if you actually try to drag the slider, it has a habit of locking up playback. The work-around seems to be to switch to another track, and then switch back, and start playing from the beginning, having learned your lesson. Absolute positioning in the track just doesn't work, so the lesson is "don't touch the slider". Awesome. Anything that works that badly should probably just be left out of the interface until the first version where it actually works. Not having the option of spoiling playback would be less distressing than being annoyed by a pretend control that doesn't move how I want it to. (GNOME bug 437987 and GNOME bug 445764 may be this.)

An alternative improvement would be to disable the slider, making it evident that it's only useful for showing how far through the track I am. (Funnily enough, I found a bug suggesting that the slider be made worse rather than better: GNOME bug 389170.)

An annoyance that's interesting for what it says about how Linux lowers one's expectations is that there's no visible suggestion that the "media keys" on the keyboard work, and I consider that a problem. The stupidly large toolbar (GNOME bug 343660) and the menu mention the more awkward keyboard equivalents, like alt-space for playback (an annoying difference from iTunes' unmodified space, for no obvious reason; unmodified space isn't taken, and using it would make iTunes refugees feel more at home at no cost to anyone else), but there's no suggestion of the media keys.

I realize it's deeply sad that I can't just assume they work, but this is Linux, and it wasn't many releases ago that they didn't work. And it's worth bearing in mind that I had to write a custom rc.local script to get my keyboard to work at all with Ubuntu 8.04! (I actually installed KDE and tried amaroK, and my media keys don't work there.)

The search field is in the wrong place, too. Where do search fields go? In the top-right of your window. I've made this UI mistake myself in the past, but it's a well established convention now. Firefox uses it (for searching the web, rather than the current document), as does Safari. iTunes uses it. Ubuntu's Add/Remove Applications uses it and Ubuntu's help browser uses it. Microsoft's developer documentation for Vista explicitly says "thou shalt". But Rhythmbox just kind of hides it in the middle of a bunch of other UI clutter.

(GTK+ apps in general make very poor use of empty space; the widgets seem to enforce large amounts of arbitrarily wasted space around themselves, and developers seem, perhaps in reaction to this, to use it only rarely in ways that actually help convey meaning. Mac developers tend to be best at this, perhaps because Apple over-exaggerated in early versions of Mac OS X to get the point across. Good Mac apps tend to have far fewer controls and preferences, of course, which gives developers more room to play with.)

Did I mention that the window manager's stupid "top-left corner" behavior is exacerbated by Rhythmbox not explicitly repositioning itself where it was last? GNOME bug 502827. (Sad, of course, that individual applications need to work around systemic lameness, but remembering your last window position is a good basic feature to have anyway.)

There's advertising in the UI, too. In the latest version, two net music stores have appeared. I thought one advantage of not using iTunes would be not suffering that crap, but I was wrong. To be fair, judging by the text that disappears before you have a chance to read it (and how do you get it back?), Jamendo is actually a "store" for Free music; Magnatune isn't. But it's unclear, and feels like thin ice that really needs better explanation. Unlike iTunes', the "stores" aren't very store-like, either. They look just like your normal music collection, which isn't a very accessible way to browse unfamiliar music.

But what drives me mad the most (worse than even the slider) is the overall window behavior. When you start Rhythmbox, an icon appears in the top panel, which generally signifies some kind of "background" application; the kind of application whose window you can close without thereby quitting the application, because you can click the panel icon to get the window back. But no, clicking the close icon quits Rhythmbox. No more music. You can click the window's minimize button, but that still wastes space in the list of windows at the bottom of the screen.

You can get better behavior by clicking on the panel icon, funnily enough. If the window is visible, clicking the panel icon (somewhat bizarrely, and with visual feedback to emphasize the idea) minimizes the window "into" the panel. Clicking again re-opens the window, as you'd expect. But if you've just been fiddling in the window, which you probably have if the window's visible, your focus is on the window and your pointer is in the window; nowhere near the panel icon. So it's way more convenient to close the window, but closing the window, despite the existence of the panel icon, quits Rhythmbox and stops the music. This is GNOME bug 158168 from 2004, which the developers don't think is a bug.

It would appear from GNOME bug 523072 that Rhythmbox is equally evil if you're ripping a CD at the time. I can imagine that's even more annoying!

I came up with a patch, ran a patched version for a few weeks, and was much happier. But what's the use of a patch when the developers think the current behavior is not just acceptable, but good? The fanboys who say "patches welcome" whenever someone complains miss the point; patching is necessary but not sufficient. Realistically, you probably have to fork, and get your distribution to package your fork instead. And that's not something you can do on a feature-by-feature basis. You could try to file the bug against your favorite distribution instead, and put the maintenance burden onto the package manager, but that really doesn't seem like a healthy way forward.

Am I really destined to end up running Gentoo? Building everything from source and applying my own little set of patches on my local machine? I think I'd rather buy a Zune.

At this point, I should be honest and point out that I simply don't know what iTunes does if you close its window. Mac OS solves the underlying problem in a more fundamental way: command-h gets any application out of my face in a single keystroke. (My complaint there is that, by default, there's no visual indication that an application is hidden, which I find confusing, especially in combination with the way Mac OS' dock shows some non-running applications too. There's a hidden preference to fix that, though, and it's about the only bit of manual configuration I have to do on a new Mac.) It would be great for Linux to have a similar system-wide fix, but this is Linux, so there's no "system" (hell, there's a whole other desktop I haven't mentioned that's the default in some distributions, and there's a third semi-popular one), and Linux has no Steve Jobs insisting that stuff is not allowed to suck.

[I am reliably informed that iTunes does the right thing. Imagine my surprise.]

So given all those Linuxy lemons, you'd be well advised to have your application know how to make tasty lemonade.

Did I mention that Rhythmbox never remembers its window location? This isn't just a problem if you quit and re-start. It's a problem if you click the panel icon twice to hide and then re-show the main window, because that resets the window position to the top-left. Seriously.

Did I mention that if you quit Rhythmbox (which, as I've said, is easier to do by accident than it ought to be) it forgets what you were playing? I'm not asking for fancy stuff like remembering scroll bar positions and search field contents and multiple selections and the like: just remember which track was playing last, please!

Did I mention that the mis-positioned search field isn't activated if you use control-f (or, if you think of it more like Firefox's "Web Search", control-k)? The former does nothing (unless focus is already in a component that has some default behavior for the keystroke), while the latter is "Play Queue as Side Pane". I'm not sure, but I don't think "Play" is a verb in that sentence.

Did I mention that canceling the search field doesn't keep the currently-playing track visible in the table? I realize the "right" behavior is pretty hard to specify if you want to cover all the special cases, but there's value in doing the right thing in the most common case.

Don't get me wrong: I actually quite like Rhythmbox. It's almost good. And I do think it's a case where a bit of "ronco spray-on usability" could make the difference. I bring this up because I don't want people to think that there isn't low-hanging fruit out there. The little panel icon? It's great. I love its little notifications when the track changes, and I love that I can turn it off if I want, and I love that I turn it off in the place where it appears. All nicely done. I want to use it more, and I want Rhythmbox to stop coming across as if it thinks it's the most important app on my desktop, when 99% of the time that little panel icon is all I need to see. If listening to music were my primary occupation, I wouldn't be running Rhythmbox on Linux; I'd be running the iPod software on an iPod.

But back to my patch. My experience has been that, contrary to what you might imagine, sending patches to the most-upstream source doesn't work well. Perhaps that's because most of my patches fix things so obviously broken that the only reason the patch is needed in the first place is that the developers are either actively hostile towards, or simply indifferent to, their users. The distributions tend to be a better place to go. Sure, they apply patches that make your SSH insecure (Debian) or your Perl slow (Red Hat), but at least they get things done.

Anyway, I went to Launchpad, Ubuntu's bug database. And I found Bug #38512 in Rhythmbox: "Window close should close, not quit". Reading that bug's comments, we learn a great many things. I could summarize the thread, but really it's so educational I think everyone who's interested in the state of the Linux desktop should read the whole thread themselves.

I'll wait.

If, having read that thread, you want to see the alluded-to tantrum by one of the upstream developers, you should read the upstream bug Bug 318629 - Option to minimize to tray on close or minimize (rhythmbox). (Note that a developer calls a user a "dick" before a user calls a developer a "dick".)

Everything that's wrong with the Linux desktop is beautifully illustrated in those two bug reports.

Problem: everyone wants users until they've got them. When they've got them, they want to reshape them into their own image. When that fails, as it always will, they begin to hate their users.


Desktop Linux suckage: an interlude

Before you sink too far into despair, let me say: not everything is hopeless.

Firefox is mostly okay. I use Firefox 2 and Firefox 3 on different machines on a daily basis, and I'd be hard-pressed to tell you why you need to upgrade. I can name a few things that are different, but I don't know that they're necessarily better. But Firefox 3 isn't so much worse that I'm cursing the mother that bore it, and that's pretty good for a Linux desktop application major-version upgrade.

The GNOME System Monitor doesn't seem to change much, especially considering how often it's mentioned in the release notes ("now with extra Cairo!"), but it's every bit as "okay, I suppose" as it ever was. I think whoever writes the release notes either needs to concentrate more on stuff real people would care about, or, conversely, has an excellent feel for their audience.

The weather/calendar/date/time/world-clock applet (yes, all those things really have merged into one) is equally, well, "okay".

I use Terminator as my terminal emulator, and it's worth pointing out that it runs better on Linux than on any other platform. Fundamentally, Linux is a good platform to develop for. I use Evergreen as my editor and it, too, runs better on Linux than on any other platform. But, important though they are to me, those are two applications by developers for developers. I'm not presenting them as examples of good Linux desktop applications, suitable for the mainstream. It's okay to expect their users to understand regular expressions and know how to write scripts and all that, and their users are people for whom "Python scripting interface" or "rewritten in Objective MonkeyPoop 7.3" might actually be a feature (the kind of crap you see all too often on the web pages of Linux music players, say).

Terminal emulators and IDEs, despite what we developers might think, aren't core parts of the desktop that everyone from beginner to expert needs to be able to use, and they're not apps it's important to come to a "good enough" consensus on. They're sovereign apps that the people who use them use day-in day-out for years. These people are prepared to devote significant time and effort into learning how to best use them, because it pays off, and they know it. Everyone has their own sovereign application. Yours might be Photoshop. His might be Word. Hers might be Keynote.

I fear for all of those people, but the Linux desktop is years or decades away for them, so maybe it doesn't matter. The people we could usefully reach are the people for whom the web browser is the sovereign application.

Most people wouldn't know a terminal emulator or an IDE if it bit them, and that's as it should be. I don't know how to drive, you don't know how to spacewalk, and that guy creeping up behind you with the knife isn't a hairdresser. These are specialist skills, often requiring specialist tools, corresponding to sovereign applications. We do all know how to open doors, climb stairs, and use toilets. These are non-specialist skills requiring no specialist tools, and correspond to things like calculators and mp3 players.

Notice that it's depth rather than subject area that's important; a calculator isn't a sovereign app, but Mathematica is. An mp3 player isn't a sovereign app, but Logic Pro is.

There's another simple rule of thumb for distinguishing the two kinds of app, and it's this: if Apple or Microsoft will sell you an application separate from their OS, it's a sovereign application. If they give it to you for free (including new versions; the first iPhoto is always free, just to get you hooked), it's not a sovereign application.

Funnily enough, one of the most frequent problems with non-sovereign applications on Linux is that they think they're sovereign applications. More about that in the next post.


Desktop Linux suckage: Evolution

Ah, Evolution. Humans are known for allowing hope to triumph over experience, so although I've been public about my dislike of Evolution (the crappy half-finished mailer, not the scientific theory) since the beginning of this blog, I still give it another quick go with each release of Ubuntu.

Assuming the version that shipped with each Ubuntu release has been different each time (and you'd certainly be hard-pressed to tell at times), that's at least six versions I've tried. And all have sucked.

I'll gloss over the experience of trying Evolution version crap.useless to save myself the anguish. Evolution continues to be so appalling that the future for webmail would look as bright as ever amongst Linux users, even if the general public weren't webmail-mad. Despite competing in a category of applications that's practically defined by its members' lameness, Evolution is so beyond hope that I won't waste any more time or any more breath on it.

If you think I exaggerate, you don't even need to install and try Evolution. You can learn a lot from the Evolution FAQ:

Why cannot I reset word wrap setting for outgoing mails at 72 characters?
Sorry, the value is hardcoded.

Can I change/personalize the "On <date>, <person> wrote:" string when replying?
Currently you can only change this if you compile from source yourself.

In fact, you can probably get a pretty clear picture by just reading one FAQ question's title: "Does a version for Apple(R,TM,C,whatever) Mac OS(R,TM,C,whatever) exist?" There are two other FAQ entries with the same childishness surrounding "Microsoft", "Windows", "Sun", and "Solaris".

I think that tells you all you need to know.

Don't get me wrong: I think having a vision or an ethos is important, and it's a good and wise decision to not add every single feature anyone on the internet can think of. At the same time, I think it's even more important that you communicate the vision. Especially if one of your philosophical decisions has become a FAQ. That's people's way of telling you they don't understand what the hell you're trying to say. (And to be perfectly clear, though this shouldn't need to be said: that's your fault, not theirs.)

According to their web page "Evolution provides integrated mail, addressbook and calendaring functionality to users of the GNOME desktop". That sounds pretty inclusive to me. That says mainstream to me. That says "we are to the GNOME desktop what Mail.app and Outlook are to Mac OS and Windows". Not "we will give smart-arse answers to perfectly reasonable questions".

Somehow, this "fuck you" attitude manages to carry over to the UI, which would be a neat trick if that were actually useful.

There's Thunderbird, too, but that's like Evolution for people who like more knobs and switches. (Plus the handful of people who actually miss the built-in mailer from late 1990s versions of Netscape.)

I wonder how we got here? Once upon a time, not long ago, Unix was the only place you'd find sensible mailers. They were terminal-based, of course. One of them (mutt) even survives, in some of the more beard-overgrown corners of the internet. Various people had already taken us from terminals to GUIs, giving us nicer (and more) fonts, and eventually giving us HTML mail for amazon and netflix and newegg (and spam). But fast full-text search? Good local caching to cope with high latency/low availability connections? Great editors? Nope. We even took a step backwards on that last one, giving up $EDITOR. Not a problem for the mainstream user who doesn't have a favorite editor, but perhaps one reason why the people who might actually write the code have little inclination to do so. "You want me to use Emacs/Vim to write a program, the use of which would mean that I can't use Emacs/Vim as much? Yeah, sure; I'll get right on that."

As regular readers know, I'm neither an Emacs nor a Vim person, but nevertheless it was actually the crappiness of Evolution's editor, and of Mac OS' Mail's editor (in more recent versions; I think they improved HTML editing at the cost of plain text editing) that made it easy to give up both. If I'm going to have a crappy editing experience, I may as well take the browser's crappy editing experience. For one thing, I have to live with editing so many other things in the browser that I may as well learn its quirks. More importantly, unlike most of the Linux desktop, Firefox actually tends to improve as time passes.

I have yet to become a creationist, but I have abandoned Evolution, and I am happily using webmail for all my mail needs; an idea that I would have considered to be one of the direst of heresies in the 1990s.

Problem: no-one who could write one wants a decent desktop mailer for Linux, so there will never be one.


Desktop Linux suckage: the clipboard

X11's equivalent of the clipboard has been broken since I first used X11, back in 1993. Fifteen years later, things are still as bad as they ever were.

They say hard cases make bad law, and terminal emulators make very hard UI cases. Unfortunately, nerds being nerds, a terminal emulator tends to be the first application written for any Unix GUI. There are two main problems caused by starting with the terminal emulator and generalizing to the other 99% of applications. The more fundamental problem is that terminal emulators expect that most keystrokes can be passed through to the pseudo terminal, including the keystrokes that every other application on your system uses as keyboard equivalents for menu actions.

The X11-specific problem is that XTerm conditioned many long-term Unix users to use the selection instead of the clipboard. (If you're not an X11 user, you probably have no idea what I'm talking about here. Don't worry; we'll get to it.)

The big problem with the X11 clipboard actually has nothing to do with terminal emulators, except insofar as, if X11's designers had actually written some real apps instead of guessing what they might be and how they might behave, they probably wouldn't have crippled the clipboard in the way they did.

The easy one first, though. Mac OS uses a modifier key for menu actions (the "command" key) that didn't exist on traditional terminals, cleverly side-stepping the problem. PuTTY on Windows basically does without; a not unreasonable solution. GNOME Terminal uses control and shift (instead of just the "control" key). Terminator uses alt, which used to be popular on Unix, but fell out of favor in Linux times, thanks (I've always assumed) to the influx of Windows users.

As for the second problem, you may or may not know that Mac OS actually has multiple "pasteboards" (as usual, even their terminology is different). Mac OS hides them well enough that real people neither know nor care. Real people using Linux, even if they only use Firefox, get screwed by the old "selection" versus "clipboard" nonsense. Basically, in addition to the usual clipboard with its explicit "copy" and "paste" actions, there's a "selection". To set it you just select some text. To paste it, you press the middle button. (These days, the scroll wheel.) To paste it over existing text (such as in your web browser's location bar)... well, you can't do that. It's roughly that mistake that screws people over.
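The trap is worth spelling out, because it's subtle: highlighting the text you want to replace silently overwrites the very selection you were about to paste. A toy Python model (not real X11 code; the class and method names are invented for illustration):

```python
class PrimarySelection:
    """Toy model of X11's PRIMARY selection: merely highlighting text
    implicitly replaces the selection buffer's contents."""
    def __init__(self):
        self.contents = ""

    def select(self, text):
        # No explicit "copy" step: selecting IS copying.
        self.contents = text

    def middle_click_paste(self):
        return self.contents

primary = PrimarySelection()
primary.select("http://example.com/")      # the URL you want to paste
primary.select("http://old.example.com/")  # highlight the old URL to replace it...
print(primary.middle_click_paste())        # ...and you paste the old URL over itself
```

There's no way to select a destination without clobbering the source, which is why "paste over existing text" simply can't work with the selection.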

I see this catch people out at least once a week, and that's amongst X11 users savvy enough to simply shrug, mutter something along the lines of "bloody clipboard", and try again more carefully. As long as Linux has no Steve Jobs to stand up and say "this is hurting us, so out it goes", I don't see this getting fixed. Anyone could write the patches to remove X11 selection support. But without a Steve Jobs standing over the relevant projects' maintainers, how could we ever get them accepted? The gatekeepers we have are the XTerm-toting Emacs users who sincerely believe they couldn't live without this shit.

(KDE offers an option to ignore the selection, but I believe it defaults to the old behavior. My work-around is to just never use middle-button paste. It's a lot easier to kick the habit than most old-timers probably imagine, and it makes your muscle memory more portable to other systems.)

This isn't actually the worst part, though. The funny thing about that nonsense is that it's more likely to affect nerds than real people, because real people aren't too likely to come across the selection by accident. In fact, they're likely to give up on Linux before they get that far.

More serious is that there isn't a clipboard in the sense of a central place that stuff gets copied or cut to. It works more like a token that gets handed round. So if you copy in Firefox and paste in Rhythmbox, what happens is this: at copy time, Firefox said "if anyone wants the contents of the clipboard, they should ask me", and at paste time, Rhythmbox asked "who has the clipboard contents?", was told "Firefox", and asked Firefox "could I have the clipboard contents?". Which is all nice and efficient, saving unnecessary copying... until someone copies, quits, and tries to paste. Now the application with the data is no longer running, so the data is gone.
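A toy Python simulation of that hand-off makes the failure mode obvious (the names here are invented; real X11 does this with XSetSelectionOwner and selection-conversion requests):

```python
class X11Clipboard:
    """Toy model of X11's clipboard: the 'clipboard' is only a pointer
    to whichever application currently owns the data."""
    def __init__(self):
        self.owner = None

    def claim(self, app):
        self.owner = app

    def paste(self):
        if self.owner is None or not self.owner.running:
            return None  # the owner quit, and took the data with it
        return self.owner.provide_clipboard_data()

class App:
    def __init__(self, name, data):
        self.name, self.data, self.running = name, data, True

    def provide_clipboard_data(self):
        return self.data

    def quit(self):
        self.running = False

clipboard = X11Clipboard()
firefox = App("Firefox", "some copied text")
clipboard.claim(firefox)   # "if anyone wants the clipboard, ask me"
print(clipboard.paste())   # → some copied text
firefox.quit()
print(clipboard.paste())   # → None: copy, quit, paste loses your data
```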

This is not how a clipboard should work.

I don't need to tell you that this isn't how a clipboard should work, of course. You have common sense. You're not a paranoid engineer engineering for the worst case (of copying a 2GiB MPEG movie onto the clipboard) and completely ignoring the more common case (of copying 28 characters of text onto the clipboard). Unfortunately, the guys who got here first were paranoid, unrealistic, and broke the system to "protect" it against the possibility of ill-behaved applications.

(Java, despite its Solaris/X11 heritage, comes across as if it believes in the Windows model. So in a Java application, you can't really avoid the copying; the JDK will do it for you. You just don't get any benefit out of it. The lesson toolkit designers seem unwilling to learn, and this applies to modern-day criminals as much as the X11 80s beard crew, is that you can't force applications to behave the way you want. So forget about "all". How do you get 99% to behave well? You give them sensible behavior for free. 99% will take the default behavior, and the other 1% were lost anyway. Fearing and focusing on the 1% makes you about as smart as the TSA, and almost as annoying.)

Problem: we're still living with bad design decisions from the 1980s, and there's neither the will nor the power to get them fixed.


Desktop Linux suckage: Preferences^WAdministration

The management of Preferences on the Linux desktop sucks, too. Or do I mean Administration? Although I know enough Linux users that I can find a defender of almost every bad UI choice, Ubuntu's Preferences/Administration is roundly acknowledged to be a clusterfuck. Especially by anyone who's used a Mac, which does to preferences pretty much exactly what search engines (rather than the early 1990s' "directories") did to the web, making it easy and quick to find what you want if you type what you know into a single text field. But even something as basic as Windows' Control Panel would be a step up.

On the Linux desktop, the user has to decide between "Preferences" and "Administration", and then between, say, "Keyboard" and "Keyboard Shortcuts", or "Network" and "Network Tools", or "Appearance", "Screen Resolution" and "Screensaver". They're being compressed slowly; "Appearance" contains what used to be multiple separate craplets, and instead of "just" two keyboard options, we used to have separate "Keyboard Accessibility" and "Keyboard Layout" choices. But there are still way too many choices to make, and each one is an expensive, painful choice that requires closing the newly-opened window and starting again from scratch if you choose wrong; not just hitting a back button. If you're going to make me guess, give me the fewest possible number of chances to guess wrong.

Better still, don't make me guess my way through your needlessly over-complicated UI at all; let me tell you what I know, and take me straight to the right place, or offer a short tightly-focused list of the likely possibilities if there's more than one.
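That search needn't be clever, either; a keyword index over the existing panels would cover most queries. A hypothetical sketch (the panel names and keywords are invented examples, not GNOME's actual data):

```python
# Toy keyword index over preference panels, in the spirit of
# Mac OS X's System Preferences search field.
PANELS = {
    "Keyboard":          {"keys", "typing", "repeat rate", "shortcuts"},
    "Appearance":        {"theme", "wallpaper", "desktop background", "font"},
    "Screen Resolution": {"display", "monitor", "resolution"},
    "Network":           {"wifi", "proxy", "ip address", "connection"},
}

def find_panels(query):
    """Return panels whose keywords match the query, best matches first."""
    q = query.lower()
    hits = [(sum(q in kw for kw in keywords), name)
            for name, keywords in PANELS.items()]
    return [name for score, name in sorted(hits, reverse=True) if score > 0]

print(find_panels("wallpaper"))  # → ['Appearance']
```

The hard part isn't the lookup; it's someone sitting down and writing the keyword lists, including the Windows and Mac terms strangers will actually type.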

Apple even goes as far as translating Windows terminology into Mac terminology, in an effort to be more helpful to strangers. Too many Linux users, if not developers, actively hate strangers, especially ones from the lands of Mac OS or Windows.

I count 40 choices on Ubuntu 8.04's "Preferences" and "Administration" menu. Even alphabetical order loses its usefulness when you don't know what word they'll have used. Or whether you're even on the correct 20-item menu.

Who thought that arbitrarily splitting "Preferences" and "Administration" was anything but a crutch to support the bad idea of an unworkable overload of piffling little applications? Stuff that doesn't make sense on a menu, and wants to be in a unified System Preferences or Control Panel application, depending on which background you came from. I'm sure someone, at some point, actually thought that lots of tiny programs was a better idea than one big one, but is it not time they admitted that it doesn't actually work out in practice, and we fixed this?

And don't get me started on the utterly useless crap that's the rest of the "System" menu!

Problem: too many cooks, no benevolent dictator.


Desktop Linux suckage: window management

If we were talking about Mac OS or Windows, we wouldn't be able to talk about window management. The rest of the world compromised on good-enough solutions years ago, and it hasn't been a problem for them since. On Linux, though, we value our freedom to choose our own window manager more than we value the ability to get useful work done.

I should admit up front to my own past sins in this area; I wrote a window manager (lwm) once, but I stopped maintaining it years ago. I haven't even used it since the end of the 1990s.

At some point I grew up enough to realize that any advantages to be gained through choice of window manager were marginal, and just not worth the cost of being outside the mainstream.

Unfortunately, every few years, the Linux desktop gets a new default window manager. Well, the GNOME Linux desktop, anyway. I wouldn't know about the KDE Linux desktop, or any of the other alternative Linux desktops, but I doubt they're immune to this problem. The fact that none of the choices is as good as Mac OS or Windows' sole choice makes the problem worse, as does the fact that although it's easy to do the first 20%, no-one really wants the 10-year slog to quality, so we start again with a brand new 20% solution.

Nothing really important changes, but the title bar changes from dark blue to light blue, or the close button moves from the left to the right or back again, or the underlying scripting language [you didn't know or care it had] changes from Lisp to Python or Ruby, but never to a generic scripting interface so people can use whatever they like, because those responsible are either not bright enough to realize there's no One True Language, or are too authoritarian to really value any choice but their own. Anyway. That's a separate complaint.

The fact is, that's all you can expect from a change of window manager. Something trivial and unimportant to those who just want to use their computers rather than fiddle with them.

Sure, if you're lucky, there will be one or two minor noticeable improvements, but there will definitely be a whole bunch of regressions (stuff that used to be better and now sucks), and we'll spend the intervening years until the next regime change arguing about whether the changes are regressions or actually improvements, and whether all the users are just too stupid ("unsophisticated", if we're feeling polite) to understand that we're changing random shit for their own good.

The best part is that, like all software, window managers have bugs. And some window managers are widespread enough, and those bugs cause some applications enough trouble, that work-arounds find their way into the applications. And you can guess what happens if the window manager ever gets fixed or, more commonly, overthrown.

There was no regime change this year, though one's brewing. First, some good news. Some kind soul finally fixed the long-standing problem where a new Firefox window wouldn't get the focus, because of some internal GNOME focus-related turf war no sane user really wants to have to know about. Excellent! I don't know whether it was Firefox or GNOME, but someone finally budged, and the desktop is now slightly less annoying for it. In fact, in this respect, it's now back to where it was before the last regime change. Sweet!

That's all the good news I have, though.

Sadly, the current default window manager has a habit of opening every new window in the top-left corner of the display. It's okay for the first couple of windows you open, while there's still blank space on the desktop for it to place windows in without them overlapping, but as soon as a new window would have to overlap an old one, the genius window manager just starts to dump them all on top of each other in the top-left corner.

That's an awesome heuristic. I'm surprised no-one's ever come up with that one before. It does, after all, have the advantage of being zero lines of code. Even if it is pretty much always a bad choice.

This behavior is particularly annoying for me because I have a 30" display, so the top-left corner is on average a long way from where I happen to be. (And even on a 30" display, you can't fit many Firefox windows side-by-side before the screen is full.) If the window manager is going to dump all windows in the same place, how about the center, to be closer on average to my current mouse position?

Searching the web informs me that lots of other people hate this too, and that the fanboy consensus is that you should switch to Compiz (likely beneficiary of the next regime change), which, if you also install and run some special configuration program, has an option to put new windows in the center. A choice which still isn't quite as clever as the Windows or Mac OS heuristics, but would be a damn sight better than dumping everything in the faraway top-left.

The trouble is, Compiz and Java don't get along in a variety of ways (some of them Java's fault, though if there weren't so many X11 window managers, the JDK wouldn't be so full of crufty code to work round their various bugs). And did I mention that Compiz seems to make windows annoyingly hard to grab the edges of, making resizing from the edges hard? And moving from the title bar is hard too, because there you have the opposite problem, where it mistakenly decides that you're trying to resize rather than move? And sometimes clicks on the window frame go straight through to the window beneath? And it causes inadvertent focus-stealing with some apps? And that its wobbly-windowed stupidity makes me apoplectic? (And makes me fear that, yes, it is important to have a paymaster cracking a whip and saying "do useful stuff, not stupid but fun stuff".)

Don't get me wrong, I like the way windows fade out of existence when you close them, I quite like the shadows now they've toned them down, the alt-tab behavior is acceptable, the new window "zoom" effect is okay, and the excessive transparency of title bars is reminiscent of Mac OS 10.1, though I do doubt that the Linux world will learn as quickly as Apple did that it's not such a great idea. (You'd think that one advantage of following the rest of the world's tail lights would be that you'd see them learn from their mistakes and not blindly make exactly the same ones. You'd be wrong, though.)

Luckily, Compiz is still an option in Ubuntu 8.04, and not yet the default, so at least someone is showing signs of understanding it's not yet of shippable quality. But unless they intend to keep a tight leash on the dog of an interface that's coming our way, a load of things that are working fine will break when the switch comes, and the countdown will start on the next rewrite. That would be par for the course.

Problem: if you keep starting from scratch, you never get far enough to really polish, you waste effort re-doing the bits that were never a problem, and you cause more work (and problems) for everyone else, too.


Desktop Linux suckage: DNS caching

Ignoring all the things that I knew 8.04 outright broke for me going into this, the first new thing I noticed was that Linux coped a lot less well with Comcast's utterly craptastic DNS servers than Mac OS did.

Judging by my quick experiments, about one in ten DNS queries just doesn't come back from Comcast's DNS servers, so the waiting application just sits there, looking back at me with an unblinking stare until it finally times out, and then fails. I'd noticed this problem before, but not much, because I was still doing most non-coding on the Mac.

I didn't investigate why the Mac copes better, but the always-truthful internets support the obvious assumption that Mac OS and Windows XP have local DNS caches. Presumably Apple and Microsoft got sick of lame ISPs making their OSes look bad. I was certainly quite surprised that my Ubuntu box was doing no caching at all by default (though some applications do their own, to work around this at the level they have some control over).
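For what it's worth, Java is an example of a platform that gives applications this kind of cache for free: the JVM keeps its own DNS cache, tunable via two security properties. A minimal sketch, with illustrative TTL values of my own choosing (the property names are real JDK ones; the class name here is hypothetical):

```java
import java.security.Security;

public class DnsCacheConfig {
    // Must be called before the first name lookup; the JVM reads
    // these properties when its address cache is initialized.
    public static void configure() {
        // Cache successful lookups for 5 minutes, so a flaky DNS
        // server's latency isn't paid again on every connection.
        Security.setProperty("networkaddress.cache.ttl", "300");
        // Cache failed lookups for only 10 seconds, so one lost
        // query doesn't make a name unresolvable for long.
        Security.setProperty("networkaddress.cache.negative.ttl", "10");
    }
}
```

That only helps one process, of course; the point of a system-wide cache is that every application benefits without knowing anything about DNS.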

I installed dnsmasq for local caching, and shortly afterwards switched from Comcast to OpenDNS when I saw no sign of Comcast reacting to Kaminsky's widely-publicized DNS flaw. (Read Kaminsky's slides if you haven't already, even if you think you know what they're going to say. There's a lot of interesting stuff in there.)

The consequence of not caching by default is that Ubuntu doesn't handle real-world shitty ISP DNS servers as well as Mac OS and Windows do. I don't know if this is better in other distributions; a quick web search certainly shows that Linux users have a wide variety of DNS caching alternatives to choose from. I don't know why one isn't running by default.

Passing on DNS lossage to me, the end-user, was the most visible problem when I switched to only using Ubuntu, because it was relatively crippling. You get sick of Firefox's "Looking up" status text very fast. This is a problem because DNS is pretty deep magic to most people. Sure, you can explain it's like "a phone book for the internet", but I'm unconvinced that really helps get the idea across, and more to the point, who's out there explaining DNS to the masses? And why should anyone but a sysadmin care about DNS anyway? Especially when "here's a nickel, kid; get yourself a Mac OS or Windows license" would mean they don't have to.

We can fix this. We even have at least three solutions ready-written and raring to go. We just need someone in power to flip the switch.

Problem: the inmates are running the asylum, and even assuming you can get them to accept there's a problem, it's very hard to get them to agree on a solution.

Ignoring for the moment the question of whether Linux has the applications people need, not coping with common ISP problems is the kind of low-level stuff that makes the difference for most people. This is the stuff that keeps people stuck on (or drives the brave ones back to) Mac OS or Windows. And I'm not just talking about people so many of us feel comfortable looking down upon; I'm talking about people like you and me. Not having the time to track down problems like this has kept me from giving up my Mac for months. Demanding all your users have the ability, inclination, and time to solve technical puzzles is setting the bar pretty high. Almost everyone I know (geeks included) lacks one or more of those things.

And before you blame Comcast rather than Ubuntu, on the basis that it's Comcast's DNS that sucks, bear in mind that they have hundreds of thousands of customers you'd like to have as users. If you think it's only Comcast's DNS that sucks, you're sadly mistaken. Also bear in mind that ISP DNS servers are not going to get fixed. Not ever. As long as Mac OS and Windows users are unaffected, there's simply no motivation for them to do so. And none of these users are going to switch to Linux, as long as it "breaks the web" for them.


Desktop Linux suckage: introduction

Windows 2000
I remember around the year 2000, or whenever it was that Windows 2000 actually came out, I thought Microsoft were doing pretty well. Much as it pained me to say it, I also thought Windows was the best OS available. At the time, Windows was without question the best place to run Sun's JVM, Netscape was running aground, and IE was picking up steam to become the best web browser of the time.

Microsoft really looked like they were getting their act together. Each release of their OS had been convincingly better than the previous one, and Windows had reached a point where not only could I bear to use it, I actually quite liked using it. Outlook Express was fine for my mailing needs. Windows Media Player was fine for my mp3 needs (I had about five mp3s in those days, and as I recall, mainly used my MiniDisc player, a technology you youngsters probably don't even remember). I used my own editor (though it wasn't called Evergreen in those days), I used Sun's JVM, I used the application I was working on, and IE was a nice fast web browser, and thus all my computing needs were covered.

I have always been a man of modest desktop application needs. Right now, I see Firefox, Evergreen, and Terminator. Rhythmbox is lurking, System Monitor is running, and that's it. Oh, there's a 24-hour digital clock in the corner.

But back to the year 2000. I'd been using Linux and Solaris on my desktop for the previous five years. Out of love, rather than because either of them was necessarily a good choice, let alone the best choice. Solaris was slow and, in practice, required over-expensive and under-performing hardware. Linux ran faster (even on the same hardware) but was often unreliable, and didn't have a decent JVM, which meant I couldn't actually get much work done.

As a user, too, the situation on Unix sucked. There was no software to run, decent or otherwise, and even the web browser, an application category invented on Unix, was falling too far behind the actual web. Netscape Navigator was bloating up with useless and ill-conceived crap, rather than actually improving in ways relevant to meeting IE's challenge.

So somewhat reluctantly, when a dual-processor Compaq machine (I told you this was a long time ago) running Windows 2000 appeared, I voluntarily became a Windows user. I was instantly happier and more productive than I'd been for years. I convinced myself that Microsoft had turned a corner, and still think of those days as Microsoft's "golden age", but in retrospect, it looks more like it was a temporary blip caused by the combination of Microsoft's successes and their competition's failures.

The stench of commercial Unix rotting from the inside out had just started to reach my nostrils, Linux still wasn't ready unless you had more free time than sense, and Mac OS was still the crappy 16-bit home computer OS from the 1980s rather than the "Unix for the rest of us" it is today.

What happened next
We all know what happened next. Apple reinvented Mac OS, and the web turned into a dangerous place Windows just wasn't ready for. Commercial Unix died, with the arguable exception of the server room. Linux? Well, Linux just kind of muddled along. It didn't get much better, but it didn't get much worse. It was fine for development work, but a pretty lame desktop. Many Unix lovers (and haters, because it's quite hard to be one without also being the other) settled on a Linux box for development work and a Mac for everything else.

That sums up how I saw the pragmatic developer/user's best choices in 2004, which is when John Gruber wrote Ronco Spray-On Usability, an article recently indirectly slashdotted via Why Free Software has poor usability, and how to improve it.

If you only read one of those articles, read Gruber's. It's the best-written and it's especially interesting because despite being written in 2004, it could have been written yesterday. It's also worth reading his follow-on article from back then, which turns out to say much of what I'm about to say.

Sometimes, you're done
If you've read my reviews of the last few versions of Mac OS, you may have noticed some disenchantment. Not that Mac OS is actively getting worse; it's just no longer getting much better, for me. And the few minor ways in which it's getting worse manage to cancel out the improvements to the extent that, for example, I've stuck with 10.4 on my MacBook Pro at work, despite (or because of) the fact that I've used 10.5 at home. The truth is, 10.3 was pretty much good enough. Good enough to make it hard to add USD120's worth of value.

If you're not familiar with Mac OS, but are familiar with Microsoft Office, you've probably known a similar feeling with Office "upgrades". The few people I know who've claimed to like a version of Word, for example, tend to like the oldest version they came into contact with. Everything since then has seemed superfluous to them; more trouble than it's worth. We know many developers/managers fail to realize that good software takes Joel's ten years, but I think even more have trouble accepting that there does eventually come a day when the software is "done", and most changes you can make from that point will make the product worse for as many people as they make it better.

The arrival of that day presents a major problem for a commercial entity; it's seriously non-trivial to come up with a significantly better way of doing something. Since check-as-you-type spelling checking, for example, what's really made a significant difference to word processing? Word processing is "done" (for now), and Office 2008 is living proof that Microsoft can't afford to accept that. Mac OS, seemingly, is "done" (for now, on the desktop), and Mac OS 10.5 is living proof that Apple can't afford to accept that.

[My dad just called to tell me I'm ignoring Time Machine, which is his new favorite thing. Maybe he's right, maybe he's not. The fact that Mac OS (or Word) is good enough that we can even seriously have this argument is telling. We could hardly be having the same argument about desktop Linux, could we?]

Amusingly, though "done day" would be a time to rejoice rather than worry for Free developers, there's no real sign we're getting anywhere close to that day, despite optimistic pronouncements to the contrary. Sure, Linux has got better since 2004. As a developer, I no longer want to use anything else. OpenJDK even has me actively excited about further improvements in the near future. Life on Linux is sweet for me, as a developer. But as a user?

Ubuntu 8.04
Non-developer progress has been slow, and patchy, and sometimes seems to go backwards. I'm starting to wonder if somewhere between Ubuntu 6.06 and Ubuntu 7.10, we didn't see Linux's "Windows 2000". Ubuntu 8.04 is a drag. So much stuff that used to work no longer works, I may as well be using 64-bit Vista (or whatever it is that's supposedly causing Windows users pain at the moment). I hear people arguing at work about whether Flash works or not. Audio. DVD playing. CD burning. Sleep/hibernation. Apple keyboards.

I've had problems with all of the above, minus audio. Despite audio being one of those in-jokes amongst Linux users, I've never actually had any trouble with it. (Though come to think of it, I've only ever even bothered to try on one of the ten or so Linux machines I've used in the past decade, so maybe that doesn't mean anything.) All of these things (minus Flash) worked in 7.10. All (minus Flash and sleep/hibernation) worked in 7.04. I had high hopes for 8.04. This was obviously going to be the one. A "long-term support" release (where "long-term" means 3 years) coming 6 months after a really strong release. What could possibly go wrong?

I upgraded, and a couple of months later, I stopped using my home Mac. Time, I thought, to make Linux my OS of choice not just for development.

It's interesting to reflect, at this point, upon the fact that I eagerly upgrade my Linux box every six months. Because the new stuff is so great? No. Because the old stuff is so bad. There's always something that's causing me enough pain I want to upgrade to make it go away.

And that would be fine, if upgrading didn't always bring new pain. In the next few posts, I'll talk about how it felt to finally switch to Linux for all my desktop needs.


Making JTable look more at home on Mac OS and GNOME 2

Back in 2006, I explained how to make JTable look more at home on Mac OS and GNOME. Since then, Mac OS 10.5 added support for the smaller-sized combo boxes I bemoaned the lack of back then, and I updated the code in Subversion to take advantage of that, but didn't make another post.

At the beginning of July, Rob Ross mailed me to say he'd found a work-around for a problem visible in the screenshot in the original post: the unwanted focus ring around the scroll pane. In his own words:

I discovered that this method would add a Border to the enclosing JScrollPane if the scroll pane's border was null. Well, I had already explicitly set that border to null in an attempt to get rid of it. But when the JTable becomes displayable, this method gets called, and re-installs the default focus border.

So all you have to do in your code is re-set the border on the scroll pane to null, and that focus rectangle goes away!

I was my usual slow-to-answer self, but despite my accidental rudeness, I got another mail from Rob later in July with a follow-up:

I have been informed by some Sun Swing guys that the "canonical" way to handle this is to set an EmptyBorder on the [JScrollPane] when it's created; that way a border will not be added ... and no overriding of configureEnclosingScrollPane() is required for this particular issue, although you would still need to override it to get the other effects you're implementing.

The trouble with the Swing guys' suggestion is that, although you avoid having to subclass JTable, you have to fix every JScrollPane you use to contain a JTable. And as Rob rightly says, I still need to subclass JTable to fix the other problems.

Given that, I've committed Rob's first suggestion. See the latest src/e/gui/ETable.java for the whole thing.
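In case it's useful to anyone fixing the same focus-ring problem in their own code, here's a minimal sketch of Rob's approach (illustrative only, not the actual ETable code; the wrap helper is just for demonstration):

```java
import javax.swing.JScrollPane;
import javax.swing.JTable;
import javax.swing.JViewport;
import java.awt.Container;

public class BorderlessTable extends JTable {
    @Override
    protected void configureEnclosingScrollPane() {
        // Let JTable install the look-and-feel's default scroll pane
        // border (the unwanted focus ring)...
        super.configureEnclosingScrollPane();
        // ...then immediately remove it again. When this is called,
        // the table is already inside the scroll pane's viewport.
        Container parent = getParent();
        if (parent instanceof JViewport) {
            Container grandparent = parent.getParent();
            if (grandparent instanceof JScrollPane) {
                ((JScrollPane) grandparent).setBorder(null);
            }
        }
    }

    // Demonstration helper: normally configureEnclosingScrollPane is
    // invoked for you from addNotify when the table becomes displayable.
    public static JScrollPane wrap(BorderlessTable table) {
        JScrollPane scrollPane = new JScrollPane(table);
        table.configureEnclosingScrollPane();
        return scrollPane;
    }
}
```

The Swing guys' alternative amounts to calling setBorder(new EmptyBorder(0, 0, 0, 0)) on each scroll pane at creation time, which works because configureEnclosingScrollPane only replaces a border that is null or installed by the look and feel.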