Don't take this the wrong way, but the article's data doesn't say what they want it to. While it is true that Linux can be stripped down farther than Windows (and thus do raw calculations faster), that isn't necessarily the only reason that supercomputers use Linux.
For example, Windows licenses for supercomputers are very expensive, as historically they're server editions priced by CPU count. This is a drop in the bucket where supercomputers are concerned, but this would mean that Windows would need to be better in some other way: even if it's not much, people don't generally like to pay for nothing. The difference could almost mean another CPU in some cases.
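The "drop in the bucket" claim is easy to sanity-check with a back-of-envelope sketch. All figures below are invented for illustration (not real Microsoft prices or real cluster budgets):

```python
# Hypothetical per-socket OS licensing versus total hardware budget.
# Every number here is made up for illustration only.
nodes = 1024
sockets_per_node = 2
license_per_socket = 500        # invented USD figure per CPU socket
machine_cost = 100_000_000      # invented USD total hardware budget

license_total = nodes * sockets_per_node * license_per_socket
share = license_total / machine_cost

print(license_total)   # about $1M in licensing
print(f"{share:.1%}")  # ~1% of the machine's budget
```

Under these made-up numbers the license bill is around 1% of the machine: small relative to the whole, but still real money that could have gone into more hardware.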
Also, while Windows has much better consumer hardware support, this aspect of compatibility is not relevant to supercomputers. Linux has much better support for different hardware architectures (the so-called "Playstation 3 processor" can run Linux, but not Windows). This will typically mean additional cost savings for a Linux setup, as they won't need to pay the x86 premium (in terms of $$/processing power).
I'd place tunability at a distant second to hardware costs, and software costs at a distant third. I'm just curious why more of them don't go with BSD.
Linux installs on supercomputers often aren't "stripped down" when compared to a typical server install. Instead, they're customized, often at the kernel level. A Linux install on a PS3 is mostly just a stock PowerPC kernel and user-land binaries, with some kernel patches for communication with the Cell's SPEs. Such customization is infeasible in Windows. (It's possible, just really hard. You could try to get Microsoft to do it, but that's unlikely. Microsoft does allow certain groups to get access to the source code, but that's not easy. And even with access, I assume it's hard to make changes since, as proprietary code, there's less global knowledge and resources to help you.)
That is the key reason why Linux is used instead of Windows for supercomputers: customization.
As for Linux versus BSD, the short answer is Linux is more popular. The long answer - why Linux is more popular - I don't know.
I think "users" of supercomputers, i.e., application developers, would on average greatly prefer Linux to BSD. Ironically, it's probably for the same reason that consumer devices come with Windows.
The Windows fan club likes to point out that Windows is far more popular than Linux. The reason for that has nothing to do with quality and everything to do with monopoly.
This is pretty flawed reasoning. Sure, Linux is much faster than Windows when it comes to supercomputing and raw processing power, but that doesn't mean that it is a better operating system for everyday users.
Linux doesn't have a standardized or intuitive GUI out of the box, and creating a productive desktop environment that suits the user takes time and effort. Other operating systems provide much more productive human interfaces, which enhance productivity far more than raw processing power ever will.
You raise the point that should come to everyone's mind as soon as they read that article. Ability to perform well on supercomputers and ability to perform well on desktops or laptops are not necessarily related.
As for Linux not having a standard GUI, "Linux" and Windows are not really directly comparable. I could say that win32 doesn't have a consistent GUI, since it's been used on multiple Windows operating systems. Linux is a building block, not the whole package. Windows is a whole package. Let's compare apples and apples.
Um. Have you ever tried Ubuntu? The "standardized gui" and "desktop environment" are a non-issue these days. The list of non-monopoly reasons for Microsoft dominance is growing thin.
Though, I'll go ahead and contradict myself by noting that manufacturers such as Dell are today shipping Linux with some PCs.
Maybe Ubuntu has a "standardized GUI" (though arguably they also have Kubuntu, Edubuntu...), but Linux doesn't. When I want to create an application for Linux, I am still baffled by the wide variety of distros and their libs. To start with: (.deb or .rpm, etc.), (GTK or Qt, etc.)
Yeah, and when I go to Windows I am baffled: (exe or zip), (Win32, .NET, Java, Qt, Mono, COM, Python, wxWidgets...), (Visual Studio look-and-feel, Explorer look-and-feel, Office look-and-feel)
I run Ubuntu daily. I love Linux, and I want to see it improve. Linux dominates in the area of raw performance, but that isn't the main thing that matters to an end user.
When I sit down at a Mac, I know exactly how to do everything because all the tools and utilities are organized in a way that I understand. When I sit down at someone's customized Linux machine, it isn't always going to be easy to sit down and actually get work done.
> When I sit down at a Mac, I know exactly how to do everything because all the tools and utilities are organized in a way that I understand.
I do not find this to be the case. (I run both Mac and Linux using synergy).
I have found that I typically have to customize a Mac desktop before it is usable to me. About the same amount that I need to customise a Linux desktop.
On Windows I get frustrated and irritated because things that ought to be simple are hidden away in control panels that use non-standard terminology (e.g. "internet zone").
Most usability differences between OSes are matters of familiarity; there are only a few examples where minimising the cognitive load was a design goal for a piece of software. And in the case of a general-purpose computer that is not an interesting goal, since a computer is supposed to become whatever you tell it to become. In other words, the cognitive load of a general-purpose computer is infinite, because you can always send it more instructions... The trick is to carve durable bits of functionality out of that infinity.
> Other operating systems provide much more productive human interfaces, which enhance productivity far more than raw processing power ever will. (emphasis added)
I disagree with that. A highly customized Linux (or BSD) interface can be much more productive than other operating systems.
Yes, you're right that sitting down at a random Mac (Aqua) or Windows box is easier to pick up and get going, but that does not rule out that for daily use a tuned Unix desktop environment can be much more productive than the others would ever be.
For someone who understands usability and interface design, the ability to customize will enhance productivity. Unfortunately, most users don't understand this, so the ability to customize generally leads to clutter (see MySpace).
When I sit down at someone's customized Linux machine, ... I just open a nice terminal emulator or a virtual tty and get the work done (I can do that even in OS X).
I think Macs are quite counterintuitive for people from the Windows world. I jump between:
- OSX (my home box for Lightspeed/iTunes/Squeezecenter)
- KDE3 (my home programming box, and my work desktop until I started using Awesome a few weeks ago)
- Windows XP (my work laptop)
I feel perfectly comfortable jumping between Windows and KDE3. They're basically the same logic, the same way of thinking. OSX, which I'm posting from right now, is entirely different and extremely frustrating for me. To switch to an app, I click on the same icon I used to launch it? The menu bar associated with a window lives at the top of the screen? When I focus different windows, the weird detached menu bar at the top of the screen magically changes? GUI conventions like these don't make any intrinsic sense. You're either used to them or you're not. People who are used to Windows will find the KDE3 conventions quite familiar, whereas OSX will be quite alien.
Customization is a different issue. "End users" aren't likely to sit down at the desk of a customization-mad hacker. They might inadvertently screw up their own desktops, but they can do that with Windows anyway. "My entire screen is gray!" "Where did all my files go? They were here yesterday!" Except with complete computer beginners (a vanishing species) it's a relatively minor problem.
One toolbar that's always in the same place and always fully visible, rather than cropped because the window is too small. One icon you click to use app X, regardless of whether it's already open.
This is actually a much simpler and more logical way of doing things than the Windows way, especially for a new user. The Windows way is far more complex: it requires more learning to understand and more things to remember. You just don't think of these as extra steps because they've become so ingrained that you don't realize your behavior is actually quite complex.
I gave up on using Linux as my desktop, my family just didn't get it, and it was a pain to configure on a laptop I had. The main problem back then was that some hardware manufacturers would not produce drivers for Linux because it wasn't popular, or because of licensing issues. Has this changed much? I do think the GUI was intuitive, the problem I saw at home was that some people drove automatic shift cars their whole life, and they had a problem switching to stick shifts.
My wife has used Ubuntu on her Dell laptop for over a year now (since Gutsy was released), and while she can't figure out bittorrent on Mac, Windows, or Gnome, using Nautilus for sftp doesn't faze her. I think the single biggest drop in user comfort occurs whenever one application launches another application. Other anecdotes come from finding computers in hospitals with bad Windows images. I load an Ubuntu CD and observe. Docs come in and use Firefox. No, they can't get their specialized clinical apps, but they have no problem using the same app in a different OS. Similarly, they almost universally either use Macs or are acutely interested in making a Mac their next laptop (I heard once that 40% of docs use Macs).
I think the idea that everyone will use Linux is not so distant a reality as a lot of people think. I'm not in the IT industry. I'm a military doctor. People know about Linux, and they're intrigued. Now. Right now. And more are learning every day.
The author suggests that popularity should be related to raw processing power. If his premise was true, Beowulf clusters would be more popular than Nintendo DS's.
"The Windows' fan club likes to point out that Windows is far more popular than Linux. The reason for that has nothing to do with quality and everything to do with monopoly. Nothing shows that better than the semi-annual TOP500 list of the world's most powerful supercomputers."
Quite. It's like saying the Honda Civic is no good because the Terex Titan (world's largest truck) is so much more powerful. Mind you, I have heard exactly that argument from protectionist zealots.
Perhaps the most misleading title of an article I've ever seen.
"It doesn't get any faster than Linux" would get my vote - at first glance, I thought some flaw had been found in the kernel, meaning it wouldn't go "any faster" than it presently does.
I am sure it used to have another name, though? Also, really the same version as in the PS3, or the "pro" version? (Ah, now I remember: "Cell" was the name).
I don't remember where I read this, but in respect to the topic raised in this article I remember someone criticized lots of the big players (IBM) for focusing too much on server technology and things which would make Linux scale rather than things which would be useful for desktop users.
As for the article being junk, I see lots of other posters have addressed that already.