Linus Torvalds, Alan Cox respond to Miguel De Icaza on Linux Desktop woes (itwire.com)
162 points by emkemp on Sept 2, 2012 | 173 comments


The actual conversation is more interesting and revealing than the blogspam on top of it (toolbar I have to dismiss? really?). This quote is noteworthy (from Linus):

> I personally think that one reason that the Linux kernel has been so successful was the fact that I didn't have a huge vision of where I wanted to force people to go

I agree. Thing is, this just doesn't work when it comes to user interfaces. You end up with a morass of principalities and wanna-be-kings who all have their own different view of what Linux on the desktop should be. This is an area where consistency really matters (to users).

What's more, the primary contributors are engineers who are unrepresentative of any kind of mass user base (potential or actual): people who want to be able to customize everything when no one else cares.

Ultimately I think the pseudo-anarchic development process doomed Linux on the desktop, but some external factors didn't help, namely OSX. OSX really is "good enough" for anyone wanting a *nix desktop/laptop. Sure it has problems, but it means the gap between it and Linux is just that much smaller. And with Linux you give up great hardware (to varying degrees) and a snappy and consistent UI/UX.

The company that has had the most success putting Linux interfaces in people's hands is, of course, Google with Android (disclaimer: I work for Google). And even there the principality problem has created a fragmentation nightmare as different vendors seek to "differentiate" themselves.

Phones and tablets of course don't have the same expectation and technical legacy that desktops do (from X11 on up). Hell, Apple did the same thing with iOS (essentially forking OSX).

Audio is another problem. Peripherally I've had to deal with PulseAudio and it really is a solution looking for a problem. Graphics drivers are another issue. AMD/ATI and nVidia have of course not helped matters here.

I think Canonical had a chance to make a real impact here but they essentially blew it.


> I think Canonical had a chance to make a real impact here but they essentially blew it.

I keep seeing claims being made, and mostly by (former?) Linux superusers, that Canonical and Ubuntu are "doomed". Could someone point me to some numbers that show signs of this?

I know when the Unity desktop came out, a lot of people were flaming. I didn't install it on my laptop (I'm still on 10.04 LTS), but I did on a colleague's desktop. From time to time, I have to assist him with code he's writing. During those sessions I have to interact with it and to be honest I'm not pleased, but each time I ask him if he wouldn't be interested in switching to Gnome or maybe installing Mint or something, he keeps saying it's not that bad. I'm sitting next to him and frustrated because I can't find where everything is and how to bring stuff up, but when I ask, he just presses a few keys and whatever we need shows up on screen.

When Miguel de Icaza's post came out last week and I saw claims on the doom of Ubuntu, I started looking for evidence. I couldn't find anything that in fact confirmed this. I found a lot of people who complained about Unity last year when it first came out, but fast forward to 2012 and there are also more and more posts of people going "I used to hate it with a passion, but I have to admit, it's now growing on me. Not everything is perfect, but with a bit of work I think they're on to something." Also interesting to me was that I had a sense that the most positive reviews came from people with no prior Linux experience. So much so that I'm getting more and more curious about it and I'm slowly mentally preparing to make a switch to it.

Is it possible that Canonical is actually moving away from Gnome and taking things in-house to avoid exactly the problems that Miguel de Icaza is underlining?


Thanks for this comment. I personally think Unity kicks ass. It is the best out-of-the-box experience I've ever had with a desktop OS.

Of course my heavily customized GNOME 2 was better suited to me, because every detail was tweaked by myself. Unity I can use out of the box, and it works really well.

I sell laptops with Ubuntu 12.04 and I have a return rate of 3%; normal return rates are 20 to 30%. It can't be that bad for normal users.


How can you live with having to search for everything? Gnome-shell gives you a nice list of all your applications and lets you switch between categories. Besides trying hard to make you remember application names, the biggest bug since the introduction of Unity is that F1 no longer opens help. The question mark symbol has also disappeared from the top panel.

After grudgingly Googling how to use Unity, I found the steps to list applications and it's absolutely horrible. Here's how:

  Right-click the Ubuntu logo*
  Choose applications
  Click "Filter results"
The buttons presented are individual toggle-buttons, not the usual switch buttons. Turn the next category on, turn the previous category off. Madness. And on top of everything that's badly designed, it's also unresponsive. A lot of clicks are missed and this is especially noticeable with the workspace switcher.

For now, there's the option of installing gnome-session-fallback. It introduces a "GNOME classic (no effects)" option in LightDM which works well and has the old, functional workspace switcher. Still no F1 for help, but it sits under Applications > Accessories so it's relatively discoverable. The panels can be customised by holding down ALT+Super while right-clicking them (thank you again, Google).
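Getting that fallback session is a one-package affair (a sketch, assuming the stock Ubuntu 12.04 repositories):

```shell
# Sketch, assuming stock Ubuntu 12.04 repositories: install the classic
# fallback session described above. It adds "GNOME Classic" and
# "GNOME Classic (No effects)" entries to the LightDM session picker.
sudo apt-get update
sudo apt-get install gnome-session-fallback
```

After installing, log out, click the session icon next to your username on the LightDM login screen, and pick the classic session before logging back in.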

Sadly, Gnome is deprecating the fallback session in favor of a software rasteriser ("llvmpipe") to run Gnome-shell where a hardware rasteriser is unavailable. The next release will also see the lock screen moved into the shell so it'll look pretty, but it better be the most stable release they've ever had since the last 2.x. Sorry to be cynical, but I doubt it.

*: This doesn't work in Unity 2D. Tough.


How can you live with having to search everything?

(Disclosure: satisfied Unity user here)

If you interact with the machine primarily through the keyboard, "search everything" is a plus, not a minus. It means you can easily launch new applications without having to take your hand off the KB and move it over to the mouse. Launching applications is much faster for me in Unity than it was in GNOME 2.x, simply because typing ALT plus the first three characters of the application name (or a generally descriptive word: ALT + "music" finds Rhythmbox, for instance) is faster than rooting around a multi-level menu.

After grudgingly Googling how to use Unity, I found the steps to list applications and it's absolutely horrible

This is because "listing applications" in Unity is a failure case. Users don't want to list applications; they want to launch the one application they were looking for. The point of Unity's launcher is to make digging through a long list of apps to find the one you wanted obsolete.


So it takes Gnome Do and makes it more complex? I think that would make Ubuntu quite unique and thus back it into a corner they invented. It's not like the GUI is a new concept these days, designs have long been ironed out. Old Gnome was fine and I would've settled for something that resembles Android 3.0+, sorry to rant.

"listing applications" in Unity is a failure case

My desktop takes the logical next step: before I moved back to gnome-panel I simply dragged the applications I used out of the overlay thing onto the desktop. I keep the icons organised, so they're always in the same place. Anecdotal observations of "normal" people show that they use their Windows Vista/7 desktops much the same way, not even bothering with the start menu except to shut down. They get grumpy whenever anything is different at all in which icons are shown and in what order. Their allegations that I broke something weeks after I accidentally left, say, a copy of unzip on their desktop are of course laughable at best, but the concept of having shortcuts on the desktop certainly isn't. Makes me wonder why Unity bothers with a separate desktop at all; it could benefit from the screen real estate if that overlay screen sat at the bottom of the stack as the root window. I loved the netbook-launcher it grew out of; too bad it was discontinued.

Users don't want to list applications; they want to launch the one application they were looking for

The Gnome consensus used to be that users don't care which application they're using, they just want to carry out a certain task. Which is why they all have boring names such as "document viewer" and "web browser". How is a user going to know to find evince and epiphany or firefox? In Unity, search quickly breaks because of this. If you know the name of the application, it can find it. But if you don't, it doesn't map keywords to application names. It would be impossible to maintain appropriate keywords for all applications in Debian (and by extension, Ubuntu) so it doesn't.

Fortunately, all the current desktops run a terminal when you press CTRL+ALT+T. It's the one failure mode they all support equally well; it supports tab completion and isn't limited to applications with a .desktop file. This is what I do for most tasks besides web browsing, so even after all these years of developing desktop environments the main purpose is still to run multiple terminals side by side. If only twm had niceties such as network manager and removable media.


So it takes Gnome Do and makes it more complex?

No. I had doubts last year about the wisdom of Canonical starting from scratch with Unity rather than just building on GNOME Do (see http://jasonlefkowitz.net/2011/04/ubuntu-11-04-everything-ol...), but the last couple of releases have put my doubts to rest. Unity is everything GNOME Do is/was, but now with the possibility to extend even further into things like the new HUD (https://wiki.ubuntu.com/Unity/HUD), which does to application menu bars what Unity does to application lists.

They get grumpy whenever anything is different at all in which icons are shown and in what order. Their allegations that I broke something weeks after I accidentally left, say, a copy of unzip on their desktop is of course laughable at best

You seem to be under the impression that the Unity desktop can't have icons/files/launchers/etc. on it. It can, mine does. It's accessible via the usual "Desktop" folder in your home directory.

If you know the name of the application, it can find it. But if you don't, it doesn't map keywords to application names. It would be impossible to maintain appropriate keywords for all applications in Debian (and by extension, Ubuntu) so it doesn't.

You're incorrect here too. Unity does support searching by keywords, and all you need to do to hook into it is create a standard freedesktop.org-style .desktop file for the app (http://standards.freedesktop.org/desktop-entry-spec/latest/a...) and fill in the "Keywords" field inside it. Unity will pick up those keywords and use them when searching in the application lens. Most of the commonly used applications already have the most obvious keywords provided for them.

(Putting the keywords in the desktop entry file means that Ubuntu/Canonical don't have to maintain a master list of keywords for every application; app developers can just provide relevant keywords for their app in the DE file they ship, and users can tune them if needed just by editing the DE file.)
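As a concrete sketch of the mechanism described above: a per-user desktop entry in ~/.local/share/applications overrides the packaged one, so you can add or tune keywords without touching system files. The file name, app name and Exec line below are made up for illustration:

```shell
# Hypothetical example: a per-user desktop entry carrying a "Keywords"
# field. Unity's application lens indexes these keywords when you
# search the Dash. The app here doesn't exist; it's for illustration only.
mkdir -p ~/.local/share/applications
cat > ~/.local/share/applications/example-editor.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Example Editor
Exec=example-editor %f
Categories=Utility;TextEditor;
Keywords=text;editor;notes;write;
EOF
```

With a file like that in place, searching the Dash for "notes" should surface "Example Editor" even though the name doesn't contain that word.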

I get the feeling from your complaints that you checked out Unity briefly early in its lifecycle and haven't checked back in on it lately. You should try it again now, you might be pleasantly surprised.


Unity is everything GNOME Do is/was, but now with the possibility to extend even further into things like the new HUD

I never got the hang of Do, either. In order to do things with do, you have to learn its vocabulary. Whenever there was something I didn't know how to do with do (or it didn't have a certain plugin), the old way of doing it was still there. This made me reluctant to ever press the button that triggers it. Maybe I'm bad for not investing time into making it work better, perhaps desktop environments are not being treated fairly in this regard. To clarify, maybe it's hard to unlearn the ways of interfaces which came before. I was taught how to browse the web with Netscape and today I'm still bitter about certain changes in Firefox (I know there's seamonkey but it's a second-class citizen nowadays). This doesn't hold true for all user interfaces, but I'm not sure if age plays a significant part (as in, learn UI X before age Y => forever stuck with the concepts of X), prolonged use or a combination of either along with not having any alternatives (thus, not building the required mental abstraction layer to differentiate the underpinnings of various UI concepts). I have only myself to sample and given that people are using user unfriendly (or even hostile) window managers such as evilwm, fluxbox and xmonad, there have to be wildly different "mental operating systems" in any representative sample group.

Gnome usually worked well despite this psychological hellhole, I could introduce people to it and it generally wasn't met with contempt. They knew their users so well that I, for one, was shocked and baffled when at the first command I entered in the preferences of the panel applet Fish, it berated me for trying to make it useful (there's only 1 command which does that).

Most of the commonly used applications already have the most obvious keywords provided for them.

I see, that's great. Sadly this hasn't worked for me, but I'll keep this in mind whenever I'm searching in Unity and something is missing. I hope seeing how it works hasn't skewed the way I use it too much.

app developers can just provide relevant keywords for their app in the DE file they ship

Yes, I hope package maintainers will do this too. This can be useful for a11y and other desktop environments, nice.

I get the feeling from your complaints that you checked out Unity briefly early in its lifecycle and haven't checked back in on it lately.

This is false. Although I have tried Unity in each Ubuntu release starting from 10.10, I've tried hard to work with it (instead of replacing it outright) in 12.04 over the course of months. I turned to Google just to figure out where my applications are for starters, instead of flipping tables. What recently drove me away is the instability, not the glaring usability quirks. At some point, compiz flat out refused to start but compiz 2d still worked (for a while). Compiz stability is very hard to fix as it can even depend on obscure GPU bugs only found in serial number x through y of model z manufactured by {.

However, I did manage to find one improvement over gnome-panel: it's easy to use the keyboard to navigate panel applets/indicators. Just by pressing e.g. Alt+F, you can move (using the arrow keys) all the way to the power button. A lack of pointer input always renders gnome-panel useless.


> The Gnome consensus used to be that users don't care which application they're using, they just want to carry out a certain task.

Still, just press the Super key, type in "doc", and you will see "Document Viewer", which is evince.


For help, press ALT+F1.

You can see most shortcuts by holding down the Super key for a few seconds.


This does not actually work; it merely highlights the Ubuntu button in the dock. According to the shortcuts overview, it's "Open Launcher keyboard navigation mode". I ran unity --reset to make sure it's not just me.


Thanks! In gnome-panel, this brings up the left side menu.

I'll try the help whenever I'm in Unity again.


What really bothers me about Unity is that I haven't seen it work. I've got a quad-core machine with a decent NVidia GPU and 4GB of ram, and I was seeing multiple seconds of UI lag, even on the 2D mode.

I switched to Xfce and haven't looked back. Everything is snappy, and everything is configurable.


The Unity interface is incredibly cool. Love it, love it.

Performance was a bit choppy when I had 4GB ram, and sometimes compiz would freeze my system. However, upgrading to 8GB RAM (costs about $50) totally fixed everything for me. Just upgrade.

I'm sure they could optimize things better, but that would slow down their frantic pace of innovation.


Just upgrade.

Why? Why should someone upgrade their RAM just so the UI works smoothly? We've had smooth UIs for twenty years, even back when 32 MB of RAM was the norm - so why should someone upgrade to 8 GB just for the UI? Hell, if I upgrade my RAM (I have 8 already, but anyway) I'll want it to benefit all the applications I like to run, and I expect the UI and OS to use as little as possible and still perform smoothly.

My old Eee PC was able to run Windows XP just fine before I installed Arch with a tiling window manager on it, and everything ran well, everything was fast, everything was smooth - for about three years - and then I installed Ubuntu and Unity brought it to a standstill despite not using any different applications than I always used. Even switching workspaces sometimes took 15 seconds or so. Yes, it only has 1 GB of RAM, but like I said, I used the same applications that I always did without problems. I struggled with it for a few months, switched back to Arch and haven't had any problems since (though I now use a laptop which does have 8 gigs of RAM, and rather than using that RAM to make Unity run smoothly, I'm using it so I can run Windows 7 and Linux at the same time with VirtualBox).


I love Xfce myself, but Unity on 12.04 feels really snappy on my System76 laptop.


Maybe a config problem? It works perfectly fine on my old Core 2 Duo laptop with 4GB RAM. I almost never experienced multi-second lag (though I do admit I see some crashes here and there) in the latest Ubuntu release.


This, IMHO, is the problem right here in a nutshell.

Maybe it's a config problem. Maybe it's not. Maybe it's sunspots. Who cares? It doesn't JUST WORK and that - more than anything else - is what will doom Desktop Linux.

I run Ubuntu on what essentially amounts to a media desktop. It does work, but that's because I spent hours on researching the simplest, cheapest hardware config that would work seamlessly with Ubuntu without requiring me to compile my own drivers, etc. The fact is that I, even as a power user, really don't want to spend hours messing around with that stuff... so I can't imagine the Average Joe would want to.


Oh, but that has been an issue since the ancient times of the beginning of Linux. Be it monitors/screens, (win)modems, audio, etc.

Slashdot comments are full of "Works-for-me" replies to real issues with varying levels of "you are lUser" for not being able to make it work.


Predictions of Linux's doom have also been there since the ancient times of the beginning of Linux.

Actually, back then there was much more reason to doomsay, as there really wasn't a desktop to speak of (CDE on Solaris was far more sophisticated than bare X on Linux). No hardware manufacturers bothered to provide even closed-source drivers for their hardware, hardware support was minuscule, there was no such thing as auto-detection of hardware, and web browsers didn't even exist, so apart from a handful of applications with pitiful GUIs by today's standards, every user was forced to interact with the system through the shell. There was no OpenOffice or LibreOffice (i.e. no option to use a Word/Excel/PowerPoint alternative on Linux), no desktop environments, no way to configure your system through a GUI. The list goes on and on.

Since those days, Linux has improved by many orders of magnitude. And it keeps getting better every year.

I think most people complaining about Linux these days really don't appreciate how bad and comparatively unusable it was in the old days. Back then you could not even dream of making any kind of comparison between Linux and Windows or MacOS in terms of ease of use for an average non-technical user. Now from an average user's perspective, they're all quite similar, with some minor differences on the periphery.

Sure, Linux still has some problems, but it's not like the competition is without its own problems.

Yes, sometimes a user who bought some obscure peripheral (made by a manufacturer who does not care about Linux compatibility) doesn't bother to check to see if the peripheral was listed as being Linux compatible. So they may have problems getting it to work.

But how do you think the typical Windows user likes using a virus and trojan infested system? Even Mac users are starting to fall victim to this problem. Or how do you think a typical Windows user enjoys having to reinstall the operating system every couple of years because it's slowed to a crawl through Windows-bloat?

"Slashdot comments are full of "Works-for-me" replies to real issues with varying levels of "you are lUser" for not being able to make it work."

Slashdot is not representative of the Linux community as a whole. And my experience in asking questions of Linux users has been quite different from yours. Virtually all Linux users I've encountered anywhere, from usenet to web forums to irc to mailing lists, have been exceedingly helpful. Even on Slashdot, there's quite a lot of help to be found.

And what could be more helpful than volunteering to write whole applications, whole operating systems, and even the documentation to go along with them for free? Countless Linux users have dedicated years and even decades of their life to helping the community by doing this.

The mercenary, closed-source culture that dominates in the Windows and Mac worlds is shamefully lacking compared to the vibrant open-source world that thrives in the Linux community. As a Linux user you are gifted with an entire free operating system, tens of thousands of free applications, hundreds of free device drivers, your choice of free desktops, etc. In the computer world, is this not the ultimate in generosity?


How can it spell the doom of the Linux desktop? Even if it does not work for 100% of the folks, it works for a certain % of people, and the total installed base is growing over time as more people try it and find it works for them. That is why you see the share of the Linux desktop not going down, but staying stable or growing a little, while the overall internet population grows from year to year.

Please stop the "Doom predictions". It won't happen. It does not need to be the MAINSTREAM system to exist.


Ok, perhaps my usage of the word "doom" was a tad strong. I should have said that it will relegate desktop Linux to a tiny (minuscule) percentage of the mainstream.

And that's a shame, because of the wealth of great, free tools (whether it's development tools, photo editing tools, audio editing tools, etc. etc.) in the Linux ecosystem.


I really don't think it's any problem with Desktop Linux. I accept that it's going to take a few hours to tweak a new machine to my liking. Linux makes it really easy to swap out a buggy or uncomfortable windowing environment. A lot of manufacturers seem to be going in the opposite direction, baking the windowing environment so deeply into the firmware you can't get rid of it. Linux has always been niche, and it's one I'm pretty happy in.


>Is it possible that Canonical is actually moving away from Gnome and taking things in-house to avoid exactly the problems that Miguel de Icaza is underlining?

I do think this is part of it, though they probably wouldn't say it that way. They certainly do want a consistent user experience - and they want one that rivals OS X.

I don't understand why people complain about using software they haven't learned how to use yet. It all takes some getting used to for a new user, including OS X. Now I can definitely understand the frustration though of someone who just upgraded their OS only to find they don't know how to use it anymore.


People complain about such software because we live in a world where there is no such thing as a Computer Driving Licence. Many users are perfectly fine to live in as much ignorance as possible, and are actually quite afraid of traditional computers. These users can easily be spotted at a glance on both Windows and Mac as they invariably use their desktop as a horrifically chaotic filesystem, never delving deeper, and the internet is essentially Facebook.


Actually it does exist: http://www.ecdl.com/

The world is not better off for this, as it merely teaches people how to use MS-Windows and MS-Office. It doesn't go into any detail at all about how a computer really works.


Wow, you learn something new every day! My thinking was more akin to a real life driving license, a mandatory thing you would need before being allowed on a computer unsupervised etc. Thanks for enlightening me though!


> I don't understand why people complain about using software they haven't learned how to use yet.

You shouldn't have to learn how to use the desktop! I tried out Unity, I really did. I didn't like it: my apps kept getting lost when minimized, I couldn't have lots of small windows open, and it just fixed something that wasn't broken for me. Like everyone else, I went to Xfce.


But you do have to learn how to use a desktop. You even spent some time learning how to use a mouse. I use Xfce on my main machine and Unity on the "oops, I need Skype" machine, and I hate the switch, but that's because I have not invested time to use it well - and chances are I probably won't, just like I won't learn the annoying parts of OSX.

Desktops are a tool like any other - the mentality that they should be intuitive or easy leads down the path of eye candy as opposed to functionality.



When you learn to drive a car, you can drive almost all cars; the biggest difference is between manual and automatic gears. When you learn to use a computer, you should be able to use many of them without having to learn anything.


A car does one job. A computer does many jobs. Learning to do each job a little differently, adds up to a lot of learning. I don't know what "you should" has to do with it - I'm just pointing out that for all desktops that currently exist, you have to learn how to use it.


So you are going to make the claim that xfce does not require learning? Or just it doesn't require learning if you already knew gnome, or what? Because no learning is a pretty strong claim...


Yes, I agree, but they're going down the same path as Gnome: the "we know better" attitude.

"one that rivals OSX" what a joke. Windows XP is a better rival to OSX than Unity.


Actually I think it is the other way 'round. OS X has a pretty lousy user experience by default.


> with Linux you give up great hardware (to varying degrees)

I assume you're talking about Linux driver issues. But since we're talking about OSX vs Linux here, it's only fair to note that Linux supports a much greater variety of hardware than OSX. Linux also tends to support the very newest hardware more consistently than OSX (perhaps excluding video cards). See, e.g., the laughable specs of Mac Pros.

Software also benchmarks faster on Linux vs OSX on the same hardware. So there is still an opening on Linux for people concerned with performance.

But yes, the Linux desktop is still light-years behind in terms of design, for the reasons you mention.


it's only fair to note that Linux supports a much greater variety of hardware than OSX.

Entirely true, but I also don't have to look up what is or is not supported by OSX. I just buy an Apple computer. Of course, there are obvious tradeoffs here about having the freedom to buy whatever hardware you want.


Ubuntu has initiatives to let you just buy a computer with Linux preinstalled too!

http://www.ubuntu.com/certification/


That's hardly an achievement of OSX though. Supporting pretty much a single set of hardware and some variations isn't really hard, and it does not count as "great hardware support" in my book. That's just one of the advantages you get by selling both the devices and the software.


I don't disagree, but as a consumer, I just care that it's easy.


> I assume you're talking about Linux driver issues. But since we're talking about OSX vs Linux here, it's only fair to note that Linux supports a much greater variety of hardware than OSX.

I'll take supporting a limited set of hardware very well over supporting tons of hardware, often in a mediocre-at-best fashion. I'm not just talking about video cards either; over the years I've had "fun" with everything from SATA controllers to sound chips that were 'supported' under Linux.


If you're fine with limited hardware just use hardware that is supported by Linux. Every Thinkpad I've used so far worked perfectly out of the box. If you're not sure what to buy just look at the list of Ubuntu certified hardware: http://www.ubuntu.com/certification/


Honest question: isn't hardware supported by Linux strictly a superset of OSX? I.e., is there any Apple computer you cannot install Linux on? I've been running Ubuntu on a mac mini and I know others who do the same. Is there a problem installing Ubuntu on a macbook or an imac? If not, then this whole subthread is moot, no?


IIRC, there are issues with the latest Retina MacBook Pro.


Usually, yes. For instance, Ubuntu runs perfectly on the latest MacBook Air.


Speaking of Thinkpad, there's also http://www.thinkwiki.org


Thinkwiki is absolutely fantastic. I think that may just be the thing I miss most about not using a Thinkpad anymore.


I'd rather spend time building. The value prop of doing the research isn't worth it when I can just buy a Macbook Air and not even have to think about it.


Frankly I have never done any research at all when buying laptops intending to use Linux on them. Maybe I've just gotten lucky, but I really don't think so.

The last computer I purchased, I bought at a bar on my phone while rather intoxicated. The only thing I did was sort Newegg's laptop selection by "cheapest first" and buy the cheapest one. I don't think I missed out on any "building" time... If what everyone says about Linux hardware support were true, there is no way in hell that should have turned out fine.


Oh, statistically speaking I reckon this could work out just fine. However, Linux hardware support is a lot better than some here claim. The average desktop with its x86 architecture, usually on-board sound, and some mediocre GeForce is supported just fine. Seriously, that's what most people use.

Concerning the video cards I think people miss out on the proprietary drivers from Nvidia here which have always worked brilliantly for me since mid 2007 or something. Yes, they are proprietary but so what? As far as I know ATI cards work pretty well too.

Linux sound support works fine as well, it's still mostly ALSA underneath calling the shots which works like a charm. PulseAudio, I admit, usually doesn't. You always have the option to remove it from your system though - or even better: Don't install it in the first place. The actual drivers are in ALSA though, so nothing to complain here either.

CPU and RAM support is a no-brainer with Linux. Never had any issues. I actually had a ton more issues under Win7. Recently I plugged an additional 32 gigs of memory into a workstation (64GB now) and Win7 only accepts 48 of it. Booting into Linux, everything works and I've got the full memory capacity at my fingertips.

So I wonder, what is this all about? I've been using Linux on quite a few pieces of hardware and never ran into any serious issues. Yes, I had to screw with the X config a few times, but that counts as "stupid defaults", not as "not supported".


Me too. I typically always buy Thinkpad X-Series laptops. They always worked fine with Linux and I didn't really have to do much (if any) research.

So you buy Macbooks because they run OS X and I buy Thinkpads because they run Linux. Where's the difference? (Except that I would also be able to run Linux on other types of laptops if I wanted to).


Not the one you're responding to, but: I guess the difference is the preferred pointer device plus the screen quality. I have to tell you though, I've seen quite a few posts of people switching from thinkpads to macbooks and not one of them was looking back to the thinkpad's pointer nipple, at least not if they have been using a macbook after a while. All other differences of those two products are basically a matter of taste IMO.


Here's a datapoint: after buying a 13-inch Macbook Pro (and having had it stolen), I went back to the T420. I miss two-finger scrolling at times, but I love having Page Up and Page Down. I love using the pointer and scroll bar without having to leave the home keys. When I was on a recliner or lying on a couch, using the trackpad was often annoying and uncomfortable.

Option-up/down doesn't replace an actual Page Down Key.


It's certainly interesting that it's more comfortable for you to use Page Down instead of two finger kinetic scrolling. I only find myself using the Page Down keys on Desktops instead of the mouse wheel. Well as I said, it's basically a matter of taste.

Btw. how are the screens these days? I've read that the resolution is quite low, but do they at least have good contrast / color spectrum / brightness / viewing angle? I'm asking because my father wants to buy one again (low res is actually a plus there).


As for Page Down, I think the nicest part is that when I hit it, I know exactly where to reset my eyes when reading something longer. Now, space usually worked in Safari, but it was surprisingly unpredictable. Of course, Pages meant using Option+up/down or scrolling... We all have our quirks, I guess. :)

I have the upgraded 1600x900 screen, so I can't speak to the quality of the lower resolution screen. I've heard it isn't as good, but I will say that for programming, the 1600x900 has been great. (I forgot to mention how much I hate glossy screens as opposed to matte...) From everything I've heard, the upgraded screen is much better.


Thanks for the heads up. I'm not sure yet whether 1600x900 is the right choice for him; Windows software is still not as resolution-independent as it should be and he needs rather large text. I think the retina pro would actually be quite good in that respect because the scaled resolutions are very well implemented there, however $2.3k for my father's use case is quite a stretch ;).


We come from different socio-economic groups. I simply can't fathom the idea of making a $1k purchase so casually that I couldn't even be bothered to do a bit of cursory research. Even 10-20 minutes of light googling is too much for you? How much can you actually build in that time?


Maybe you're happy with that, but there are others who want a decent package manager and don't want to deal with the differences between OSX and Linux.


> I'll take supporting a limited set of hardware very well over supporting tons of hardware often in a mediocre at best fashion

Ha! Check any recent thread on Mountain Lion updates and you'll find a ton of non-"very well supported" hardware glitches.


> I think the pseudo-anarchic development process doomed Linux on the desktop

I'll repeat myself here. Linux is the natural successor of the technical Unix workstation. It is, therefore, a niche product: built by a highly technical crowd for a highly technical crowd. Only recently have efforts been made to make Linux more muggle-friendly, and those efforts nearly coincide with the decline of the desktop PC's relevance.


You must have never seen Corel Linux, Caldera OpenLinux, Lindows/Linspire, etc.

Many attempts were made to make Linux more user-friendly at the start of the century (and ever since).


I did see those, indeed. Caldera's was somewhat successful in making Linux easier to install - at least when compared to the Debian and Slackware installers of the time. At least, I managed to install it on my machine. I would also mention Conectiva's as an easy to install and use Linux.

Unfortunately, at that time, the lack of desktop applications was a huge problem for the average user. Today, many people read e-mail, collaborate on documents and engage in a large part of their social lives through a browser. That was not the case in the early 2000s. Having an easy to use (or Windows-like) file manager, application launcher or even package manager was not enough.

Perhaps it's unfair to give credit to Canonical for making Ubuntu grandma-friendly when a lot of its ease of use comes from applications like LibreOffice, Thunderbird and Firefox (and Gmail, Gdocs and Facebook).


I think Canonical had a chance to make a real impact here but they essentially blew it.

Curious to hear you elaborate on what they blew, and how. I don't use Ubuntu myself (prefer straight Debian), but I've always respected what they're trying to do (and to some extent, what they have already accomplished).


Linux is not a corporation. It's not fighting for turf. If Mac OS X is good enough for most of the desktop users, Linux doesn't have to throw resources and bullshit around (as corporations do) to counter that success. As long as Mac OS X does its job well Linux can focus on being linux-y, and succeed.


Since it's customary to bash on PulseAudio: Here's one user that is very happy to have it, using its end-user features.

I wouldn't want to have a machine again, where any of these things won't work:

- Set volume per source/application (More flash video for the presentation, some mp3 still in the background)

- Move applications between devices (Wife complains about the sound of my video call/video/game? Move that thing to the USB (!! No jack that directly does what you want anyway) headphones)

- Normalize output so that nothing blows up my headphones when I switch between different audio sources

Apart from that I did use the 'stream over the network' feature a couple of times in the past and while it wasn't something I'd need on a regular basis it worked fine for me.
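For what it's worth, the per-application volume and stream-moving features above are also scriptable through `pactl`. A sketch - the subcommands are real PulseAudio CLI, but the stream index and sink name below are hypothetical placeholders you'd read off `pactl list short`:

```shell
# Sketch of the end-user features above, driven from the CLI. The index (42)
# and sink name are placeholders; errors are ignored since there may be no
# PulseAudio server running at all on this machine.
STREAM=42                                     # from `pactl list short sink-inputs`
SINK="alsa_output.usb-headset.analog-stereo"  # from `pactl list short sinks`

if command -v pactl >/dev/null 2>&1; then
    # Per-application volume: turn the background mp3 player down to 40%
    pactl set-sink-input-volume "$STREAM" 40% 2>/dev/null || true
    # Move a live stream (video call, game) over to the USB headphones
    pactl move-sink-input "$STREAM" "$SINK" 2>/dev/null || true
fi
```

The same things are clickable in pavucontrol, of course; the CLI just makes them bindable to hotkeys.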

Long story short: Pulseaudio solves problems for me. It totally might not do that for you, but please don't project.


I <3 pulseaudio as well, glad to hear someone else loves it. All the reasons you mention :)


Google's Android users activate one million new devices each day. This Linux based mobile OS has absolutely crushed Microsoft's mobile software, and has surpassed Apple worldwide.

I have begun to wonder that, given the evident superiority of Linux in mobile, given the lack of GNU within it, and given the central control of Linux mobile development by Google and Amazon, then to what degree has GNU and its development traditions held back Linux desktop adoption?

Thoughts?


GNU is not really relevant, except maybe to say that free software philosophy can't build something for the mass market, since it takes money to pay workers to slave away for the benefit of the plebes.

The UI layers (GNOME/KDE/Canonical) may be relevant. Linux desktop doesn't have a giant company funding massive ad campaigns and retail outlets and spit-and-polish and OEM integration, and all that is needed to launch a successful product for the mass market.

Linux desktop is very successful in the developer workstation and server markets, where the users are more tech savvy.


Now you made me curious. How would I be able to switch between different sound cards on the fly without restarting applications if PulseAudio did not exist?


Let's take a common case: I have a motherboard sound device which I use 95% of the time, and I have a USB camera that has a built-in microphone.

When I go to settings in Google Hangout, it asks me which device I want to use. ALSA presents the interfaces. I choose whatever combination I want -- mic from the camera, headphone out through the mobo -- and it works.
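Without PulseAudio, the same mix-and-match can, as far as I know, also be pinned globally in ALSA's own user config. A hypothetical `~/.asoundrc`, where the card/device numbers are placeholders you'd read from `aplay -l` and `arecord -l`:

```
# ~/.asoundrc - playback via the motherboard, capture via the USB camera mic.
# "plughw:0,0" and "plughw:2,0" are assumed card,device pairs; run `aplay -l`
# and `arecord -l` to find the real numbers on your machine.
pcm.!default {
    type asym
    playback.pcm "plughw:0,0"
    capture.pcm  "plughw:2,0"
}
```

The asym plugin just splits the default PCM into separate playback and capture halves, which is exactly the mobo-out/camera-mic combination described above.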


How many people ever had more than one sound card in their computer, let alone a need to switch between them?

http://xkcd.com/619/


Everyone with headphones or headsets using Bluetooth or USB.


I have some USB speakers I'm fond of, and PulseAudio was indeed the only way they ever worked right for me on Linux.


As CPU core counts continue to rise and Flash support continues to get dropped from more platforms, that comic will only become more ironic.


I think I have four: Onboard audio, pcie sound card, usb microphone and hdmi audio on video card.


Not sure if serious


The problem with Android is that it's diverging from the Linux desktop path (i.e. X11 -> Wayland). Being completely incompatible sets Android apart, and arguably it can't be called a Linux desktop at all. The split it caused in mobile devices (with no drivers available except for Android) is a horror; as Aaron Seigo put it, Android is the best friend and the worst enemy.

The real future of the Linux desktop and mobile is Wayland, and Android it seems will stay apart forever, going its own path.


"the real future"...

The real future is what users like best, not what developers think that users might like. I personally don't like the Wayland approach and I think that competition is fine. Let the "market" decide what's best.


Users like what's comfortable and functional to use. But those who create it are developers. Regular mobile Linux has a hard time presenting existing software solutions to the user because of the hardware barrier. Not because they are in any way inferior to the Android user experience (if anything, they are superior).

I.e. there are practically no manufacturers which provide closed Linux drivers or open specs (so open source drivers could be created). They mostly only care to provide Android drivers. Hopefully Jolla and Plasma Active will help to break through somehow.

This situation is caused by Android's historic roots: it started as a proprietary project and didn't take into account any interests of the Linux community. The fact that it was open sourced later didn't really change anything - it's de facto completely separate from the Linux desktop, as well as from the mobile Linux varieties that share effort with the desktop distros. Wayland was created with collaboration in mind. Android's architecture was not.


I'm curious - why can't a desktop version of Android be possible? I mean, OSX is moving to a world where an app developer (optimistically) can code once and compile for both desktop and mobile targets. Why can't Android drive the same in Linux?

I would like to think that if Android was available on both desktops and mobiles, we would have been happily playing Dark Souls using Steam-on-Linux by now.


It's not that it "can't". It's just that it would hurt Linux even more. Do you want the sick situation with mobile drivers to spread to the desktop as well? No, thanks.

Also, Android is rather narrow in its capabilities compared to normative Linux. No need to cripple the desktop like that.

It's going to be the other way around - normative Linux will start competing with Android in the mobile sphere.


> The split it caused in the mobile devices (with no drivers available except for Android) is a horror, and like Aaron Seigo called it - Android is the best friend and the worst enemy.

These mobile devices could grow some HDMI and USB ports in a few iterations and become stationary.

Tada - there's your Linux desktop.


That's not the point (many mobile devices already have USB and HDMI). The point is that Android's incompatible architecture sets it apart from the rest of the Linux world. And it's a horror to squeeze X.org or Wayland drivers (as well as other Linux drivers) out of the mobile manufacturers when they say "we are too busy with Android".


I have a Transformer Infinity with the dock that I use as a netbook that I can plug into TVs and stream movies from. The most recent iterations of the Nexus Q et al have mini USB and mini HDMI ports and can effectively do the same thing. Android devices have been at the point of being desktop replacements since this year, mainly due to the Transformer tablets, but any Bluetooth keyboard + HDMI-compliant device can serve the same function given a beefy enough processor.


Actually I think Canonical is doing a terrific job. IMO it has the best out-of-the-box experience of all desktop OSes on the market.


A link to the actual discussion is probably better, it includes some back and forth between Cox and Icaza as well as other comments: https://plus.google.com/115250422803614415116/posts/hMT5kW8L...


"probably"? Very definitely better than an article that reads like a story written by a 10-year-old: A said this, and then B said that, and then A responded...


I agree, maybe OP should have just posted this thread instead. After reading this I'm wondering if Miguel de Icaza is not in fact equating Gnome to Linux Desktop. One may be dying, doesn't have to mean that the other will follow suit.


The Linux desktop has been wildly successful! Hobbyists have been using it since the 1990s. Yahoo! was built using the linux desktop. Much of Facebook was, too. People in web dev and tech support use it as their workstation OS.

The Linux desktop has failed to become a mainstream 'consumer' OS. Meritocracy has nothing to do with it. Microsoft's monopoly is just unbreakable here. OS X is less than 10% of the market. Linux less than 1%. I believe the Linux desktop is usable in the retail space, because I installed gOS on a guest computer and several novices (including my mom) used it for email, web, music and word processing.

My linux desktop progression has been X11R6 -> KDE -> Mandrake -> Ubuntu gnome -> Mint -> latest Ubuntu HATED IT! -> Xfce crunchbang -> (plan to) Mint Xfce


> The Linux desktop has been wildly successful! Hobbyists have been using it since the 1990s. Yahoo! was built using the linux desktop. Much of Facebook was, too. People in web dev and tech support use it as their workstation OS.

While true, look around at younger companies, OS X has taken over a very large segment of the web dev crowd and I don't see any signs of that changing, especially with the almost mandatory nature of using OSX for iOS development.


This is entirely brand. I went through college where all the cool kids had overpriced shiny aluminum Macs, and that was the hip thing to own. Having used OSX on laptops in high school and iMacs in college, I find the desktop awful and much prefer the diversity of Linux DEs, especially ones like Cinnamon and XFCE.

OSX got its market share off spoiled 20-year-old college kids getting their parents to buy them overpriced trendy hardware. I have a 15-year-old and a 13-year-old cousin who each have a Macbook because it's hip.

But besides the hip factor, OSX succeeds because it is the default on Apple products, just as Windows succeeds because it is the default on everything else. When average Joe buys a PC, he goes to Best Buy, where every single device has Windows on it, so that's what he uses. He doesn't know any better and would never fathom installing an OS, and if you show him an alternative it breaks his mind, because to him Windows is the computer.


"the desktop is awful and I much prefer the diversity of Linux DEs [...] OSX got its market share off spoiled 20 year old college kids getting their parents to buy them overpriced trendy hardware."

Sigh. Stop stating your opinion as fact. Personally, I use OSX because it's a *nix OS that actually has a coherent UI, and it's paired with the best hardware available right now. Saying that it's popular because it's trendy is wildly misguided- how do you explain all the developers that use it?


Saying that it's popular because it's trendy is wildly misguided- how do you explain all the developers that use it?

Trendy, popular, whatever... The original comment is spot on. Products become trendy and popular mostly because of great marketing. The reason you prefer OSX (or almost anything else in your life) is that you've been conditioned to. We make decisions based on a complex cocktail of facts and feelings, most of them external to our own thinking and therefore susceptible to clever manipulation.

Every BMW owner buys an "ultimate driving machine" (with an automatic transmission) and most OSX users believe they're enjoying "the best UI/UX" (and obediently squeeze their life into ~500 vertical pixels leaving the rest to the Dock and global menu).

It does not make sense to look for a technical explanation for why "Linux desktop" is not popular. Technology has nothing to do with popularity. Moreover, Linux Desktop is indeed widely popular among people who're pre-conditioned to like things it offers.

Here's an anecdote: I have introduced younger relatives to computers in mid-2000s using only Ubuntu. After years of daily Gnome 2.x use they find both OSX and Windows incredibly "retarded" and hard to use.

Don't kid yourself thinking we're far ahead of Pavlov dogs.


The reason you prefer OSX (or almost anything else in your life) is because you've been conditioned to.

No, it's because I've used the other alternatives and made a rational decision. I tried Ubuntu, it didn't work with the hardware in my laptop, the trackpad was useless and it didn't run a number of programs I require (the Creative Suite, for one). I would happily use Windows, but the hardware Apple creates (in my case, the Air) is largely unparalleled in terms of build quality and weight. So, OSX it is.

These are all measurable, quantifiable things. While I can't disagree that marketing affects people, to suggest that the reason everyone likes something is because of conditioning is a gross generalisation.


This. While it's true many people buy Apple gear because of branding and image, well, that's marketing for you. That's how retail works. As a well seasoned computer guy, I've chosen every single mac I've purchased over the last number of years. There was a time when I wouldn't touch one with a 10 foot pole, and I'm sure a time will come when I won't want a new one.

I've used enough different operating systems in all kinds of configurations over the last 20+ years to decide like a grown-up what I'm going to spend a considerable amount of time using.. not because other people think it's cool, but because it fits my budget and works for me. My recommendation to others is: Try stuff and use what works for you.


This is what Apple does right. Windows and Linux were never a realistic option.

If you needed the form factor of an Air and Creative Suite, there really was no decision, rational or otherwise. There is certainly no conditioning; playing with Ubuntu was pointless.


The Dock is auto-hideable and can move to the side, and the global menu is 20px tall, leaving 748px even on an old 768px iBook; fullscreen mode suppresses that global menu, which is just a relocation of the in-window menus grapeshot all over the windows in other OSes. So where is the squeeze?

Most of the people you are lecturing actually use 2 or 3 of Windows/OSX/Linux on a regular basis, and so have informed experience backing their preferences.


I switched to OS X in 2008. Prior to that, I had been using Linux since 1995 and contributed quite a lot to Free Software (Gnome, gtkmm, KDE, Rosegarden). Maybe I saw one too many Apple ads on a billboard. Or maybe, after trying my mom's Macbook, I realized that OS X is pretty cool to use, while on Linux I just had to google around some poorly written docs to restore my tilt-wheel mouse configuration which a Kubuntu upgrade had broken.


>> The reason you prefer OSX (or almost anything else in your life) is because you've been conditioned to

>> I have introduced younger relatives to computers in mid-2000s using only Ubuntu. After years of daily Gnome 2.x use they find both OSX and Windows incredibly "retarded" and hard to use

You are refuting your own arguments. E.g., how can your nephews like Linux if they have not had marketing to condition them?


But don't ignore the fact that the Mac rose to prosperity [..] through appealing to a younger generation with access to money and a desire to look cool, because they hit that ball out of the park from my experiences, not my opinion

I'm glad it isn't your opinion, because you'd be wrong again. Mac owners, by age, are actually older on average than PC/Windows users.

http://www.metafacts.com/pages/media/tupan06_announce_061129...

As an embedded systems developer, who works in a company filled with EEs using OSX for their personal machines, I can say that what keeps me buying macs is that

* Like Linux, OSX is posix and has a working shell and scripting environment, and

* Like Windows, it has a pixel-perfect window manager and a tastefully designed UI that doesn't make my eyes bleed.


I said the market share came from college kids. I honestly never understand how developers use it, but I can understand that smart people find value in it that I don't. But don't ignore the fact that the Mac rose to prosperity not through its developer tools but through appealing to a younger generation with access to money and a desire to look cool, because they hit that ball out of the park - from my experiences, not my opinion.


Mac OSX comes paired with a limited hardware choice that is far from the best. With Linux you can build any system you want, up to very high-end computers. It's unmatched in flexibility.


It is not entirely brand. For me, I only begrudgingly use a Macintosh laptop because it is better for someone who doesn't want to fiddle, now or in future releases of their distribution. At this point, given the length of time I've stuck with it, I can probably be considered a Linux-on-workstation die-hard (I have one at work and at home that I use for most Serious Tasks), and I am pretty fed up, having given up on Ubuntu on laptops: http://news.ycombinator.com/item?id=4451792

And the meat of my complaints have nothing to do with UI, but the endless finicky regressions, typically having to do with drivers and hardware integrations.

Anecdote: my friend and co-worker has an allegedly 'good' piece of Lenovo Thinkpad hardware. He still managed to have a problem on a video driver upgrade on 32-bit (he then switched to 64-bit, where things went better -- did I mention Ubuntu's home page still suggests 32-bit Linux?) and hasn't gotten around to convincing it to talk to an external monitor. He has more patience than me -- mostly out of philosophical stubbornness -- although I suppose we both are stubborn in a way since I'm still buying desktops at every location I work frequently in 2012, and regressed to a Macintosh to fill in the gaps when moving about.

It is a disaster.


What computer less expensive than an iBook or Macbook commonly survives 4 years of school, and is less a disaster than Windows pre-7 (which I guess is old now, wow time flies)?

My MBP is 5 years old and still runs OK. (It could use battery #3 and an SSD replacement, though.)


I actually encountered a circa-2002 Toshiba Satellite a few months ago with a Pentium 4 that was running fine, and the HD didn't even report any SMART errors. The thing ran slower than dirt, but worked fine.


I completely disagree within the confines of developers. I'm not really sure what the comments about consumers have to do with this topic.

I used Linux for 5 years as a desktop OS, with occasional stints in Windows. Just recently I switched to a Mac. It has nothing to do with being hip. OS X is an OS which runs all the *nix tools I need, has a reasonably well-thought-out UI and, above all, it __just works__. It's what I wish Linux on the desktop could have been but never was, and I doubt it ever will be.


If by just works you mean that you have to deal with abominations like MacPorts or Homebrew, then yeah, but personally I cannot deal with all that crap, so Ubuntu is the one that just works for me.


I agree. It really is a pain to build OSS on the Mac, and it's not like you'd ever deploy server software to a Mac.


Not such a pain once you figure it out... And for any of us compiling stuff... seriously, are we pretending we don't all use remote linux slices/machines/whatever all over the place?

My laptops are just workstations to do day-to-day life stuff and a portal into my virtual computer world, where I'm working with remote displays, browser windows, command lines all over, and all that jazz. If it's anything remotely server related that needs to be on all the time, I'm likely not running it on my laptop anyway - it'll be on a linux box somewhere.

Or, if I'm on the move and it's really important to get a project done, a local VM on the little macbook.


How is Homebrew any different from apt? Both are package managers. Neither OS provides all of the tools a dev needs, so regardless of your OS choice, you will be using a package manager to acquire your complete toolset. So why do you recoil at Homebrew, and not apt?


I recoil at Homebrew because it fails to successfully install relatively mainstream software that it claims to support (mysql, in my case). I do not recoil from apt because it regularly and repeatedly both cleanly installs and removes any software it claims to support.


You are an exception to the rule, is my point. For every developer who buys a Mac based on taste, there are at least a dozen kids buying them not for tangible reasons but for the Apple brand - at least where I live and in what I have experienced. Different areas might produce different results, but there was only one other guy in my CS track who used a Mac for development, while tons of my out-of-major peers had them as hip Facebook machines.


at least where I live and through what I have experienced

Please just stop now. Take untog's advice and stop making arguments based on your anecdotal experience.


I was seduced by OS X for a while at work. But getting all the GNU stuff running is neither easy nor complete. I'm sure if you are writing iOS apps it is smooth.

I did like being able to open a shell and kill -9 a hung app.

But in the end it just felt slow.


It has been my experience that graphic designers use Apple's OS du jour and hackers are split between Linux and OS X.


> Mint -> latest Ubuntu HATED IT! -

Why this step? Most people go in the reverse direction.


The new UI sounded interesting. But after running it the UI felt like it was for a touch tablet and I have a K/M.


Short rant ahead:

Why should Linux be doomed (or killed, as Wired puts it) just because Windows & MacOS have more idiots using them? For a desktop there are certain design choices to be made. If one wants to target users, then one should use sensible defaults and try to hide complexity. If one wants to target hackers, then one should have a lot of easily accessible APIs, preferably accessible via a CLI - that means exposing complexity. For the two groups, consistency in a UI means two different things: users want an "intuitive" UI, whatever that means; hackers want a UI based on principles, such that one can deduce the proper operation even if it is complicated and takes some time to learn in the beginning. The list could go on, but in short: I have different needs on my desktop than the average user. And if Linux became a sensible desktop OS for the average user, I would likely move to BSD (or Plan 9 or whatever).


Upvoted because it's a rant, and a good one too! I agree with your conclusion. Same here, although I think that won't happen. Linux is really just a kernel; the whole UI/UX stuff happens at the distribution level, and I think there'll always be a distro for the hacker type.


Agreed; moreover, Linux is only a kernel, so you'll always find a distribution with aims compatible with what you want.


Unfortunately, there is no distribution which matches my "want" exactly. I know, I'll just make my own...


Agreed. I don't even know what this "Linux Desktop" most people here are talking about is[1]. If I look around at work or among my friends who use Linux, there's no single desktop that is the same. THAT is the Linux desktop IMHO, and that is its strength.

[1] Perhaps Ubuntu's default install?


Linux on the desktop isn't doomed at all. People saying otherwise are either naive or have an interest in lying about it.

I am absolutely sure that Microsoft and Apple are fighting desktop Linux any way they can, as it takes away market share, and any amount of market share is fought hard over these days.

In my opinion, Apple- and Microsoft-favoured (paid-off) bloggers, (ex-)developers and opinion writers do their best to spread FUD and anti-Linux propaganda all the time and everywhere. I think that a large part of the Ubuntu Unity FUD, for example, comes from there.

Look at any discussion remotely about Linux or Ubuntu and you will see random comments about how bad Unity is, even here in this thread. Even if the topic has nothing to do with Unity. It's not a coincidence.

I am a developer using Unity and find it the greatest thing that ever happened to Linux, and the non-developer people I support on their Ubuntu Unity machines just love it, they wouldn't want to change anything.

All I see is that the Linux desktop now is getting really competitive (yes I mean Ubuntu 12.04) and that's why we hear all this FUD now. With "competitive" I mean usability, security, ease of use and style. The market share is low, yes, but it's slowly growing and it will continue to do so.

The Linux desktop being dead is just a laughable idea when looking at the facts.


Why does this topic come up every other day, and why does everyone go on tirades about individual projects and developers being wrong or right, when the answer isn't complicated at all?

Windows is the default. Microsoft got in bed with hardware producers and computer retailers and got Windows on every product on every shelf. For two decades you went to buy a PC and you were buying a PC with Windows pre-installed. The vast majority of PC users and buyers have no comprehension of how to build their own PC, what an OS is, or that they can even use alternative operating systems. At this point, Windows is the computer to a tremendous majority of desktop OS consumers.

OSX is prosperous because it is the Mac default in the same way Windows was prosperous because it was the everything default. The average consumer does not conceptualize or think about how the software is independent of the hardware, what they see and get is what they stick with 99% of the time. The other 1% is Linux / BSD / everything else.

The day of the Linux Desktop is the day a laptop running any flavor of Linux distro sits in Best Buy next to a Windows laptop and costs $50 less because it doesn't have the Windows license fee attached. Everything else is just window dressing around that issue, which is, in plainest terms, that Apple and Microsoft have a profit motive in getting their OS on your machine and go out of their way to ship hardware with the OS preinstalled so Joe Schmoe uses it, whereas Linux distributions have no financial incentive or warchest of influence to push Linux as the default. And the default is what matters.


> The day of the Linux Desktop is the day a laptop running any flavor of a linux distro sits in Best Buy next to a Windows laptop and costs $50 less because it doesn't have the Windows license fees attached

Well... It's not Best Buy, but, still, it's something:

http://imgur.com/tzv74


You should read the "story" before commenting. Allow me to copy and paste the response I gave to a similar comment earlier:

"None of this recent discussion is about why the average computer user doesn't use Linux or won't try it -- that topic has long since been beaten to death. It's about why Linux has the inferior desktop experience, and how OSX was able to steal away so many of the developers and users that, in a perfect world, should have been happiest on Linux."


But that's not really sparking a discussion. "Inferior desktop experience" is completely subjective. Personally I think OSX has the worst desktop experience of the three (Windows, Linux and OSX), Windows sits in the middle and Linux is the most joyful one to use. That might not be true for you or for anyone else though and arguing about it is really leading nowhere.

All I know is, that I'm glad to see differences between OSes regarding not only kernels and the way things happen but also regarding UX and interfaces (where Linux is by far the most flexible, because there's more than one desktop environment - a further differentiation) and I don't want them to be the same either.


I bought a $400 Lindows desktop from WalMart last decade. Not many others did.


I've read several of these pieces now post-mortem re the linux desktop. Far from disagreement, I find myself agreeing with everyone's major points.

I think the bottom line is despite how much people want one, there isn't just one reason for its unpopularity, there are dozens. It isn't good enough on multiple fronts--paper cuts abound next to deep slashes. All in the shadow of two compelling choices in modern Windows and OSX.

They have their own annoyances as well, but you can expect drivers and audio to work, and no major regressions--nothing you use just disappearing between releases without recourse. I'm thinking of the gnome service control manager workalike and sessions, thanks Ubuntu.


"nothing to just disappear between releases without recourse."

Spaces, "Save As"... :(


Those are annoyances one can work around. They are not APIs anyway (which is what the debate is all about), they are UI design decisions. Plus "Save As" kinda came back (hold Option while browsing the file menu), and Spaces are still here inside Mission Control.

Changing APIs like Gnome/KDE did frequently is something else altogether, especially when you don't have the luxury of a huge user base to force commercial companies to rewrite their code. As for open source application developers, they often just flee and abandon their apps rather than rewrite them for the third time around some new API.


I'd just like to point out as a KDE developer that we don't go changing our APIs all willy-nilly.

Every major release I can remember (since KDE 2.0) has kept binary compatibility for the libraries (see http://techbase.kde.org/Policies/Binary_Compatibility_Issues...). There have been exceptions for Plasma libraries, but those were announced beforehand.
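For the curious, the main trick that makes this kind of ABI promise sustainable is the d-pointer (pimpl) idiom that the policy page describes. A minimal sketch, with a hypothetical `Notifier` class (in real library code the `Private` struct would be defined in the .cpp file rather than the header, so its layout is completely hidden from client binaries):

```cpp
#include <memory>
#include <string>

// Hypothetical library class using the d-pointer idiom: all data members
// live in a private struct reached through a single pointer, so later
// releases can add fields to Private without changing the public class's
// size or layout -- code compiled against the old header keeps working.
class Notifier {
public:
    Notifier() : d(new Private) {}
    void setMessage(const std::string &msg) { d->message = msg; }
    std::string message() const { return d->message; }

private:
    struct Private {
        std::string message;
        // New members can be appended here in a future release
        // without breaking binary compatibility for existing clients.
    };
    std::unique_ptr<Private> d; // the only member the ABI ever sees
};
```

The cost is one extra allocation and indirection per object, which is why it's a library convention rather than something the compiler gives you for free.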

Many current KDE libraries are evolutions of technology going back to KDE 2 (for better or worse).

Perhaps the biggest change over the whole lifetime has been the shell (going from kdesktop+kicker in KDE 2+3 to Plasma in KDE 4) and going from aRts in KDE 2+3 to Phonon in KDE 4.

Btw, we were highly encouraged to adopt the popular multimedia framework at the time (gstreamer 0.10) instead of writing Phonon. gstreamer will soon be releasing an incompatible 1.0, while in the interim Phonon gained native support for the sound server that became highly popular in the meantime, PulseAudio.

KDE has been doing its part to insulate developers as much as possible from mandatory API changes. It's not as good a record as the kernel's or Microsoft's, but it's certainly not as bad as GNOME's, IMHO.

I can say this as an app maintainer too. Going onto the KDE 4.10 development cycle, JuK still has code that uses the "kde3support" library. :(


I'd just like to point out, as a KDE user, that I've had far fewer problems with the graphics drivers under KDE than with Gnome 2.x or Gnome 3.

The thing that made me change from Gnome 2.x to KDE 4 was that Gnome ran at a slug's pace with the VESA driver (the ATI drivers weren't working well in a barebones machine I was building for the TV), while KDE/KWin ran pretty smoothly. Plus, in recent years I've found that when the crappy ATI drivers introduce some instability into the GUI, KDE at least self-recovers very quickly and I don't end up with an unusable GUI, like I did with Gnome Shell + the proprietary ATI drivers.

Finally, I love how I can customize my desktop to my needs... It's like the old days when Enlightenment was the cool and beautiful wm, and it was easy to change anything.

KDE 4 isn't perfect, but I think it's much better than the crap that Gnome is these days.


I stopped following KDE about 5 years ago, but I remember constant changes to the multimedia APIs. Also, each new Qt version (2->3->4, now 5) brings incompatibilities, no?


The multimedia API changed from KDE 2 and 3's aRts to KDE 4's Phonon. The API is still Phonon, and this will be true in KDE 5 as well.

There were other application-specific APIs that were used for things that were beyond the capability of aRts and Phonon (e.g. Juk could use gstreamer directly in KDE 3) but the KDE APIs themselves were pretty stable.

It's true that a Qt change is binary incompatible and involves at least some source porting effort, but honestly those have been fairly infrequent, and only one of those transitions (3->4) involved more than minor source changes. Qt 4 was released at the end of 2005 (around the time of Linux 2.6.15) and is still the "current" version, though the Qt 5 release is around the corner.

In just the KDE 3 timeframe to now, on the other hand, there has been gstreamer 0.6, 0.8, 0.10, soon 1.0, PulseAudio, ConsoleKit, HAL, HAL's replacement (DeviceKit), DeviceKit's replacement (udev and friends, upower, udisks, etc.), PolicyKit, PolicyKit's replacement ("polkit"), NetworkManager 0.8, NetworkManager 0.9.

Needless to say, trying to keep in sync with all of that has not been fun. :-/


Carbon -> Cocoa was a massive change. So was PPC -> x86.

There are also massive API compatibility issues even between those two enormous shifts -- witness the sudden breakage of all SDL games due to raw surface access in CoreGraphics moving to IOSurface, just to name one.

The difference is, I think, that Apple has conditioned developers to expect massive breakage, and the OS X app industry has structural differences from the Linux desktop app community that make it more likely for apps to get fixed quickly.


Yes the facts really aren't with Miguel on the API breakage claim.


>Carbon -> Cocoa was a massive change. So was PPC -> x86.

1) Both were necessary. Carbon was a stop-gap port of the old Mac way and incompatible with the OS X design, and the x86 transition had to be made because the PPC mobile roadmap was late (and bad). It was not a "let's break API compatibility because we came up with a cool new idea" or "because the old maintainer got bored and the new one has different ideas", which often happens in Gnome/Linux land.

2) Not to mention that the Carbon->Cocoa transition took 7+ years to complete (Carbon apps actually still run on modern OS X), and Apple went to the effort of providing a full bundled compatibility layer for PPC for many years (Rosetta), plus "fat binaries" to make the switch easier. I.e. nothing like the "let's break stuff to make it better and let them deal with it" attitude in the Gnome world.

3) Also: Apple remade the old stuff ITSELF each time, with paid developers working on OS X, the bundled userland, and the Apple pro apps to port them, whereas Gnome's changes made lots of developers walk away from free, open source Gnome apps.

4) Plus: Apple had a lot of market share (compared to Linux) to force companies like Adobe, Microsoft et al, to port their stuff along eventually. Gnome did not have that benefit.

So, not at all the same, and no, it's far more than "Apple conditioning developers to expect change".


The point of desktop Linux is that it keeps MS and Apple honest. If they push things too far away from where the common good lies (i.e. toward the short-term interests of their stockholders) then expect to see a huge resurgence of desktop Linux, supported by OEMs who will suddenly be pushing cheaper, faster Linux machines with tested specs as a way of winning back territory. Until then, don't, because it will be down to a volunteer community doing heroic work to other people's schedules, and basically that's always going to be hard.

Server side stuff has a lot of support from big shops that need their iron to work well - so that can go along just fine as is.

My point is that because it exists and is nearly viable it is a success and does what we need it to, even if using it can be nightmarish.


Linux will never work on the desktop because it is made FOR neckbeards by neckbeards.

It is a community driven effort, which means the community pushes its development and design. Who do you see using the linux desktop? The exact type of people who design and develop it. Power users and developers.

They are a stubborn bunch. Even if Unity/Gnome 3/etc were more usable desktops, the majority of users would hate it. Because something like a newbie usable desktop just doesn't fit who's designing the desktop.

Something else will have to shove it into the wider market where it can be judged by those standards, not by what set of power-user keyboard shortcuts it supports.


Oh, don't tell people how they feel about things! There are hordes of non-programmers (or their likes) using Ubuntu, even though they could choose Windows or Mac. Neither my SO nor my brother can code and they have rarely if ever been in a terminal, but both are very happy with Ubuntu.


I agree with you, my SO is fine with it also. I'm talking about the design process. I really like the direction that Ubuntu is going. It's being designed for the masses, but the pressure on ubuntu is from the power user crowd.

I think it will go nowhere because the current linux user base will be holding it back. Gnome3 and Unity are fighting this tide also.

I don't think the state of the linux desktop should be, "Copy XP and put a real shell behind it".


Ubuntu's mission statement from the start was linux for normal users. They always said that they were trying to provide a desktop for regular users, and their argument was that there are already heaps of distros out there for power users. It's not a new direction they've taken, it's just when they had the opportunity to simplify, a lot of users didn't understand what their core mission was.

Besides, if you want a Power User version of Ubuntu, just use Debian.


Just based on the Linus quotes, I think he's right.

The kernel did what it was supposed to do back in 1991.

What is the "desktop" supposed to do?

There will never be unanimous agreement on that - every user will have different needs and preferences - and so desktop developers live in constant denial, believing they are the only ones whose preferences are relevant - i.e. they know what's best for users - and blaming others for their own failings.

I'm not a Linux fan, but I give Torvalds a +1 for his response (and for knowing the value of "not breaking stuff").


The problem with Linux is much simpler IMHO. No, I don't mean the actual causes, but the root cause: Linus is a technical leader, so the kernel is a good product. He is not a "product leader" from the point of view of the final desktop experience, so he took no responsibility for that (he has actually blamed the UX of Linux many times in the past).

No one emerged as the "Linux Desktop Guru", taking full responsibility for a distribution completely targeted at fixing the UX, selecting a set of defaults to actually create a development platform, and fixing the support problem by supporting at least a subset of computer systems perfectly (targeting Macs makes sense, for instance).

So the Linux Desktop was the sum of a multitude of different attempts that, taken alone, were sometimes interesting, but not nearly enough to define the whole OS experience from the point of view of the user.


Is it possible for the community to allow a desktop guru? Early on in GNOME they had a number of popular graphic-designer types making all sorts of really exotic-looking UIs; a number of conflicts ended that. (Enlightenment was the official window manager for a while there.)

Then the Eazel guys showed up and made Nautilus; if anybody could have been the guru, it was Andy. I honestly don't know what happened there. Nautilus is still around, and it's undergone some substantial changes along the way.

Seems to me it's a very difficult community problem to solve. Look at replacing sysv init: it's a very technical issue, it doesn't matter to a lot of users, there are some technical goals you can bring up to keep the discussion rooted in fact over opinion, and yet there is tremendous disagreement about it. The community hasn't arrived at any conclusion exactly. Why should something as esoteric as a UI be easier to arrive at consensus on? Everyone has an opinion on UI.

There is another aspect to it: distribution fragmentation. How do you set an IP address in Ubuntu? How about Fedora? Do it via the UI and via the command line. They are different, and that's an easy one. The mechanics are so different that it's not a desktop problem, it's a distribution problem, which sort of makes KDE and GNOME toolkits rather than desktops themselves.


Well, what I mean is, imagine if Ubuntu at some point started to be more and more opinionated about what a desktop Linux should look like, with some people inside fully responsible for building the UX in every detail, for making binary packages trivial to download and install, for making drivers trivial to install (for a subset of the hardware that is fully supported, at least), and so forth.

A distribution that tried to provide a full experience, instead of being, indeed, a Software Distribution. In that case you can have people in charge of deciding all the details. Paid people, to do the best work possible.


Well that is what Ubuntu is doing now. So we don't really have to imagine it. Maybe we could imagine it being more complete, or more successful.


"We'll force Corba/.NET down your throats"... ???

Firstly, Corba was there from the start, and then it was removed.

Secondly, there is no .NET dependency that must be satisfied to create GNOME applications.

While there are a few interesting points Torvalds made, the problem isn't really a technical one in the code, but rather a lack of focus on what really matters.


> there is no .NET dependency that must be satisfied to create GNOME applications.

No, but many apps that are part of the "official" Gnome desktop run on Mono.


Like?


Banshee, F-Spot, Tomboy.


There might also be the fact that OS X and Windows are more than decent desktop OSes and competing with them is not easy.


I think it is more about people being exposed to Windows and Macintoshes. If my non-technical parents had Ubuntu at their jobs and their computer were shipped with it, do you think they would even dream about searching for a new OS to learn?


The same argument works for any of those systems.. not just linux.

The reality is many people try many systems and settle on one they like.


The discipline of being POSIX compliant forced Linux to innovate across a defined boundary. I recall Cliff Click's talk about the JVM and how its definition of garbage collection APIs, class libraries, etc. each enabled innovation to be done independently of one another.

There is no such story for kernel-mode drivers. The problem was so dire with network card drivers at one time that there was a project that enabled Windows drivers to work on Linux. Think about it - if Linux standardized on a stable external API, then there'd be no arguments about how to implement it.

Linus kept talking about userland vs kernel, and did not address the issue Miguel raised. Kernels need performant external interfaces too. Drawing a line around what is being innovated and changed would only help Linux the way POSIX did.

On another tangent, when Microsoft made IE6 the stable browser platform, it gave Mozilla a clear goal. IE6 became a reference implementation, allowing Mozilla to emulate all the layout behaviors and eventually carve out a nice market share for itself.


ndiswrapper is/was more a function of the hardware vendors not supporting Linux/supporting Linux very badly. Go look at some of the early Linux Realtek drivers for example.



None of this recent discussion is about why the average computer user doesn't use Linux or won't try it -- that topic has long since been beaten to death. It's about why Linux has the inferior desktop experience, and how OSX was able to steal away so many of the developers and users that, in a perfect world, should have been happiest on Linux.


It has an inferior experience because you can't profit from making it better.


[deleted]


I consider the linux desktop to be quite usable. So many window managers exist... a spectrum from things like schemewm to unity.

Unity is a good one to focus on b/c it's intended for a mass-market audience. So far I've noticed a few things that illustrate the difference in money/professionalism between Canonical and Apple -- poor 64-bit support, support for the 2D version gone in favor of a slower 3D version forced on all users, many unproven UX metaphors, and a very bold and experimental (but still flawed) attempt to make it work on tablets.

Apple, on the other hand, is far less ambitious: the window manager experience is smooth, and iterations between versions are quite minor and clearly reflect lots of thought and planning.

I'm rooting for Linux on the Desktop. I'd be using it today if there were a manufacturer other than Apple whose laptop I felt like carrying around.

Brief tangent: I'd be curious to try a ubuntu fork that was intended only for the Macbook Air with all the customization and optimization for that hardware ready to go after install.


> support for the 2D version is gone in favor of a slower 3D version forced on all users

Remember when they updated iOS on an early-generation iPhone model and it became unusably slow?

But in general they simply won't update older hardware. You have that choice with Ubuntu, too.


> Brief tangent: I'd be curious to try a ubuntu fork that was intended only for the Macbook Air with all the customization and optimization for that hardware ready to go after install.

My guess is that you'd really only need to replace a few packages, rather than have an 'Ubuntu fork'.


> Brief tangent: I'd be curious to try a ubuntu fork that was intended only for the Macbook Air with all the customization and optimization for that hardware ready to go after install.

That's what the XPS 13 Ultrabook aims to be: http://liliputing.com/2012/07/dells-to-sell-ubuntu-ultrabook...


Dell is a non option for me at this point. The company has tried to cash in on its previous reputation and now sells garbage with the Dell logo on it. I have already been burned and will never buy another Dell product.

Dell used to make decent laptops but hasn't since about 2000.


Talking with a couple of friends who are exposed to Dell via work, it seems that Dell has improved in quality in recent years. Not high quality exactly, but their machines at least now seem appropriate for their price point and aren't utter garbage.


I have an Inspiron 1720 from 2007 and it's still my daily workstation. An amazing machine. The only problem it has is a fading backlight.


Community democracy is not good at such things. I believe a well-funded tyrant with good taste could create a viable desktop though.

You'd need a Jobs, a Wozniak or two, and a team, working for the good of humanity... there's probably not much money to be made at this stage of the game, which is why it is so unlikely to happen.


"I believe a well-funded tyrant with good taste could create a viable desktop though."

The counter-argument, of course, is Mark Shuttleworth and Unity. :-)


"With good taste" is an important part of the equation. Not trying to force a tablet interface onto dual screen workstations. No, they'll have to do a lot better. Experimentation is fine too, but do it in an unstable branch.


It seems to me that Ubuntu won't really be stable until Canonical seriously tries to monetize and market it in cooperation with one or more hardware vendors for wide availability.

Until then, all Ubuntu users are essentially beta testers.


Is he behind all the UI decisions? Does he have the final word on everything there?


There's plenty of finger-pointing when commercial software enterprises miss their goals too; it just usually happens behind closed doors.



