I often mention Joel Spolsky's article Good Software Takes Ten Years. Get Used To It. It's one of the best things he's ever written. One thing Joel doesn't promise in that article (or any other) is that the ten years it takes to create good software will be ten years of fun. He doesn't promise ten years of interesting code or ten years of great new features. Quite the opposite, in fact.
Most of your ten years will be spent on problems with Danish keyboards on Mac OS X 10.3, or fonts on the Thai version of Windows 98, or some bug in some IMAP server that this one guy in Kabul sometimes has to connect to, or just making your error messages clearer, more specific, and more focused on helping the user fix their problem.
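That error-message point is worth making concrete. Here's a minimal sketch (the file name, size limit, and function are all invented for illustration) of the difference between an error that leaves the user stuck and one that names the thing, the cause, and a next step:

```python
# Vague: "Error: operation failed" tells the user nothing.
# Specific: name what failed, why, and what to do about it.

def save_config(path, data, max_size=4096):
    """Write config data, failing with an actionable message."""
    if len(data) > max_size:
        raise ValueError(
            f"Can't save '{path}': the configuration is {len(data)} bytes, "
            f"but the maximum is {max_size} bytes. "
            f"Remove unused entries and try again."
        )
    with open(path, "w") as f:
        f.write(data)
```

Nothing clever is happening there; it's just the grind of spelling out, case by case, what the user actually needs to know.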
One of the big things that Joel explicitly states, because he has to, is that contrary to what we leave university thinking, these "hacks" are the most important bits of code we write. They aren't nasty bits of cruft that we should be marching out into the world to do away with, in order to make the world a better place. They're nasty bits of cruft that we're marching out into the world to add, in order to make the world a better place. They make stuff "just work", even though the hardware's broken. They make stuff "just work", even though the software they have to interact with is broken. They make stuff "just work" for most users because making 80% of your users happy is better than waiting for the perfect solution to come along and making no-one happy in the meantime.
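The IMAP-server sort of hack might look like this (a minimal illustrative sketch; the tolerant line-splitting is my invented example, not any specific server's bug): the spec says protocol lines end in CRLF, but you accept whatever broken servers actually send, because rejecting their output punishes your users, not the server's authors.

```python
def split_lines(raw: bytes) -> list[bytes]:
    """Split a protocol response into lines.

    The spec says lines end in CRLF, but some broken servers send
    bare LF, and the odd one sends CR alone. Accepting all three
    keeps things "just working" for users stuck behind those servers.
    """
    # bytes.splitlines() tolerates \r\n, \n, and \r alike.
    return raw.splitlines()

# Strict CRLF-only parsing would choke on this broken-but-real-world mix:
broken = b"* OK ready\n* 1 EXISTS\r\n"
```

The nasty bit of cruft isn't the code itself; it's the admission, buried in the docstring, that you're deliberately accepting input the spec forbids.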
The real world is full of lossage. The very phrase "real world" is practically defined by an admission of the existence of lossage. Industrial quantities of the stuff. But the academics mislead us, wanting to reshape us in their own image, and would have us believe that what the world needs is more beautiful theories and, maybe (and only if you can do it without getting your hands dirty) a partial implementation of the beautiful theory. They can only say this with a straight face because they are, at heart, ascetics. That's why they're in universities, for dog's sake. I'm not specifically talking about desktops right now, but their idea of a desktop is probably a single copy of vi running in a single XTerm, where they're editing some 10-line Haskell program with really clever puns in all the function names.
The presence of puns and absence of side-effects is what passes for quality in their world.
By the time we realize these people were lying, or deluded, or naive, or whatever, it's usually too late. We're too busy to give people what they want or need for free any more, so the Linux world has to make do with what our younger selves thought people ought to want. A hundred half-finished flying machines and not one working toilet.
Welcome to paradise. You might want to consider getting constipated.
There's a picture in Apple's HIG, in the section Prioritizing Design Decisions. I'll describe the picture in case the link breaks, or you just can't be bothered to follow it. The picture shows three blocks stacked atop each other. The largest block, on the bottom, is "Minimum Requirements". The next block up, somewhat smaller, is "Features users expect". The top, smallest, block is "Differentiation". The idea is that you start with the big block on the bottom and work your way up.
Again, no genius. No "rocket science". Just common sense and good engineering. (If you actually read that chapter of Apple's HIG, you'll find that what they're actually talking about is very pedestrian stuff indeed. But at least it's an ethos.)
Problem: Linux desktop apps don't care about "Features users expect" or "Minimum requirements" at all. They may offer some "Differentiation", but not necessarily. Mostly, you just get "The bits that are fun to code".
I said Joel didn't promise the ten years would be fun. There's something else he didn't promise. He didn't promise that ten years would be sufficient. Necessary? Yes. Sufficient? Clearly not.