In 2006, Greg Kroah-Hartman, a well-known Linux kernel expert, was able to declare that the following statement:
"Plug and Play in Linux is Still not at the Windows Level"
was not only incorrect, but that this statement was true:
"Linux supports more devices "out of the box" than any other operating system ever has"
If this is true, then why is it that people don't know this?
I dunno. I guess people are just dumb. I mean, recent polls showed that a very large percentage of Americans believe in fairies. Or was it witches. Whatever.
Instead of focusing on dumb things people believe, let's focus on good things people should know. About Linux, of course.
Now, I will state again, as I do now and then, that I am not a high-level expert on all areas of operating systems or computers. I wrote my first computer program in 1966 at the age of 8. By the time I was 10, I was sort of able to write computer programs that worked, but that were not too sophisticated. I have diligently remained at that level (10-year-old) ever since. I know everything about 1% of computer technology, nothing about 65%, and bits and pieces of the rest. I have a shelf of O'Reilly books and Idiot's guides. And I know how to use them.
But I digress. I just wanted to let you know about the Linux Driver Project.
This is a group of just under, or over, 300 developers and a dozen or so project managers who develop and maintain Linux kernel drivers. A few posts back, I mentioned that one of the great things about Linux was that the API was not blindly maintained in such a way that old functions were retained forever ... which means two things:
1) When Linux runs, it is not full of a bunch of crap that no one, or hardly anyone, ever uses; but
2) "Downstream" development (development of code ... programs and stuff) that relied on that API cpuld break when the API changed.
I took a lot of heat for saying that this is a good idea. I was partly basing my thinking on a presentation by Greg Kroah-Hartman, author of Linux Device Drivers, 3rd Edition, and Linux Kernel in a Nutshell (O'Reilly). I still think this idea ... maintaining the API in this way ... is right. Not maintaining a stable API is probably more of a problem for proprietary developers, who can't rely on a few hundred crazy programmers to pull their nuts out of the fire, so perhaps it can be accurately stated that for the "Windows/Proprietary" world a stable API is an asset, but for the "Linux/OpenSource" world, a dynamically changing API is the way to go.
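To make the breakage in point (2) concrete, here is a sketch of the kind of version guard an out-of-tree driver accumulates when an in-kernel API moves underneath it. The mydrv names are hypothetical, but the underlying change is real: in kernel 2.6.20, INIT_WORK() dropped its third argument, and work handlers started receiving the work_struct itself.

    #include <linux/kernel.h>
    #include <linux/version.h>
    #include <linux/workqueue.h>

    struct mydrv_dev {                  /* hypothetical driver state */
        struct work_struct work;
        /* ... */
    };

    #if LINUX_VERSION_CODE < KERNEL_VERSION(2, 6, 20)
    /* Old interface: the handler received a bare data pointer. */
    static void mydrv_work_handler(void *data)
    {
        struct mydrv_dev *dev = data;
        /* ... service dev ... */
    }
    #else
    /* New interface: the handler receives the work_struct and digs
     * out its containing structure. */
    static void mydrv_work_handler(struct work_struct *work)
    {
        struct mydrv_dev *dev = container_of(work, struct mydrv_dev, work);
        /* ... service dev ... */
    }
    #endif

    static void mydrv_init_work(struct mydrv_dev *dev)
    {
    #if LINUX_VERSION_CODE < KERNEL_VERSION(2, 6, 20)
        INIT_WORK(&dev->work, mydrv_work_handler, dev);
    #else
        INIT_WORK(&dev->work, mydrv_work_handler);
    #endif
    }

Drivers that live in the kernel tree never need guards like these, because whoever changes the API fixes every in-tree caller at the same time; that is exactly the burden the Linux Driver Project lifts from vendors.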
But again, I digress.
Greg is the manager/founder/guy in charge of the Linux Driver Project. He wrote in his blog:
Way back in January, I announced a program to write Linux drivers for companies for free. When I did that, I never expected the response to be as large as it was.
It turns out that there were two large groups of people who responded to the announcement, companies wanting drivers, and developers wanting to help out.
[source]
This is an interesting philosophy, isn't it? Greg actually found himself facing an overwhelming task, because the project grew so large and so fast. Fortunately, he works for Novell, and for reasons of their own, Novell has dedicated Greg's entire FTE to the project ... he now works full time on the Linux Driver Project.
The Linux Driver Project is located here.
I have an HP LaserJet printer. It's a somewhat older model. I hooked it to my Windows machine, and the plug-and-play thingie did not work with XP and this printer. It took me all afternoon to get the printer to work. However, now I get popups and strange notices whenever the Windows machine is turned on, regarding the printer and the driver, and I've also got googolbytes of HP software clogging up the too-small hard drive.
So I yanked the cord on that printer and put it on my Ubuntu Linux box. I installed the printer by clicking on a thing that said "Add Printer" ... and the driver for this printer was suggested, I chose it, and in something like a minute or less it was installed.
And I have none of that dumb HP software.
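For the curious, here is why no pile of vendor software is needed: printing on Linux goes through CUPS, and any program can simply ask CUPS what queues exist. This little listing is my own illustration (file name and all), not anything HP or Ubuntu ships:

    /* list_printers.c -- ask CUPS for its configured print queues.
     * Build with: gcc list_printers.c -o list_printers -lcups */
    #include <stdio.h>
    #include <cups/cups.h>

    int main(void)
    {
        cups_dest_t *dests;
        int i;
        int num_dests = cupsGetDests(&dests);  /* every queue CUPS knows */

        for (i = 0; i < num_dests; i++)
            printf("%s%s\n", dests[i].name,
                   dests[i].is_default ? " (default)" : "");

        cupsFreeDests(num_dests, dests);
        return 0;
    }

The "Add Printer" dialog is just a front end to the same system; once the queue exists, every application on the machine can print to it.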
You may ask, why did I even try to hook this printer up to a Windows machine? For two reasons. 1) The Windows machine was sitting there anyway, and 2) I have another printer, an all-in-one, and a strange el-cheapo scanner, and I figured, because I'm a dummy, that there would more likely be Windows drivers than Linux drivers for all three devices, and that I would use the Windows machine as a hub for them.
However, the Windows machine keeps falling off of my network (my Linux machines do not). Also, when I hook the scanner up to the XP machine, I have roughly similar results as for the HP printer ... I can get it to work but all sorts of strange things have to happen first.
When I hook the scanner up to the Linux machine, nothing happens. There does not even seem to be a way to install it...
... but when I find the "acquire" command on a piece of software (say, Gimp) and select the subcommand that should go to a scanner, the scanner is there. It did not need to be installed, or at least, not by me. It just happened.
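As far as I can tell, what is going on under the hood is SANE, the scanner framework: scanner support is a runtime probe, not an install step. Here is a hedged sketch of the same question Gimp's acquire plugin effectively asks (the file name and build line are my assumptions):

    /* find_scanners.c -- ask SANE which scanners are reachable.
     * Build with: gcc find_scanners.c -o find_scanners -lsane */
    #include <stdio.h>
    #include <sane/sane.h>

    int main(void)
    {
        const SANE_Device **devs;
        int i;

        sane_init(NULL, NULL);            /* skip version check and auth */
        if (sane_get_devices(&devs, SANE_TRUE) == SANE_STATUS_GOOD) {
            for (i = 0; devs[i] != NULL; i++)
                printf("%s: %s %s\n", devs[i]->name,
                       devs[i]->vendor, devs[i]->model);
        }
        sane_exit();
        return 0;
    }

Because the SANE backends probe the bus at runtime, a supported scanner simply appears the moment it is plugged in; there is nothing for the user to install.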
Linux is better.
You need to write a series of posts for computer idiots like me on how and where to start using Linux. What machine to buy, how to plug it into the wall, how to switch it on, step by little step.
Coturnix, here is the simple outline of such a guide:
1) Get any machine.
2) Get any Linux boot disk (I recommend Ubuntu for ease of use, or one of the versions made for first time users).
3) Install Linux after playing with the live disk to check you like it.
4) Play with Linux till you break it.
5) Re-install and repeat till you know it well enough to not break it anymore.
I'm on step 5 myself, although it didn't help that the new box I bought clean had a CPU problem and had to be sent back to base. It might be better for you to get a box with Windows or something preinstalled and do a dual boot (I did that, but with a trial version of XP64, not a pre-installed, licensed version). Such systems go through more testing before shipping, so you will likely avoid such problems.
As for help, I find the Ubuntu Forums infinitely useful.
Coturnix, if you want any specific help, depending on your technological understanding, please let me know how I can help. I am always willing to get someone onto Linux and away from Windoze. If you need help getting an ISO file to burn, or installing, or even dual booting, let me know.
Hmmm, which makes me think I should write a post on doing all those things on my blog at some point.
Greg, if you're going to admit that you're not the sort of person who actually writes and maintains open-source kernel code for a living, please don't try to tell those who are that stable APIs don't matter. I am such a person, and I can say that unstable open-source APIs are one of the more vexing problems we have to deal with. In fact, dealing with it is a significant part of how it's possible to make money with open source, because availability and expertise aren't the same thing and people pay for the latter even when the former is a given.
A lot of problems only occur on certain platforms or configurations, so if your API change causes such a problem then you can't just blindly rely on a few hundred crazy programmers to pull your nuts out of the fire. Even for a platform or device that many people have, there might be a mere handful of people out there who actually understand it and the subsystem you just changed well enough to reconcile the two, and they might well have other things to do. Every moment they spend catching up with your change is a moment they don't spend advancing their project in other ways. When there are not two but many pieces of code involved, all changing, there might be nobody who can reconcile them all before they're outdated. For example, my last company used a half-dozen pieces of open-source software including Linux. Despite the fact that we had the source for all of them, nobody understood the Linux kernel and the Fibre Channel drivers and the InfiniBand drivers (many) and so on enough to make major changes to all of them, so the web of "version X of this depends on version Y of that" became a major impediment to progress. What made it really annoying was that a lot of the changes were just unnecessary "not invented here" rewrites of perfectly adequate code, with almost no gain to offset the cost.
This changing-API treadmill hurts not only individual projects or companies, but the entire open-source ecosystem as well when everybody's busy keeping up with somebody else's ill-advised changes. That's not to say that old APIs must be retained forever, but saying that they can just be discarded without a second thought is just as wrong. It's a case-by-case decision, weighing the cost of maintaining the old API vs. the disruption to one's fellow programmers of removing it. Windows perhaps tends to err on the side of carrying the load, while Linux perhaps sheds it too quickly, but there's no One Right Way that can be stated in a mere sentence or two.
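One middle path, offered here as a sketch rather than as either commenter's position: when an API must change, keep the old entry point alive for a release or two as a thin, loudly deprecated wrapper. Old callers keep compiling, the compiler nags them toward the replacement, and the maintenance cost is one line. The widget_* names are made up for illustration:

    #include <stdio.h>

    /* New API: callers can now pass behavior flags. */
    static int widget_open_ex(const char *name, unsigned int flags)
    {
        printf("opening %s (flags=%u)\n", name, flags);
        return 0;
    }

    /* Old API, kept as a one-line shim. The attribute makes GCC warn
     * anyone still calling it, without breaking their build. */
    __attribute__((deprecated))
    static int widget_open(const char *name)
    {
        return widget_open_ex(name, 0);
    }

    int main(void)
    {
        return widget_open("lp0");   /* works, but warns at compile time */
    }

Whether the shim stays for one release or ten is exactly the case-by-case weighing described above.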
Please let us know when (if) Linux gets to the Mac level.
Uh, what about those of us who want to get something, like, you know, accomplished?
Well, of course I would say that Linux had gotten to and surpassed Mac levels a long time ago, and, of course, I would also point out that OS X is a Un*x-based system at heart, albeit one that does not come with those wonderful GNU utilities and all the cool bells and whistles preinstalled (at least, the ones at my work didn't). And where would I be without those? Certainly, unaccomplished. In any case, one does not have to "play with Linux till it breaks". What do you need to do? OpenOffice and KOffice, GIMP, Gnumeric, Scribus et al. work out of the box; and if you want to be esoteric, various compilers and editors abound. I've never had a problem accomplishing anything, and then playing with it (and occasionally breaking it!) in my spare time...;)
Paul: I love your step 4 to 5 cycle. That is exactly right!
Jeff: I accept your commentary as a very valid set of points worthy of debate among Linux hackers. But you have fallen into a fallacy: my self-humbling comments were serious overstatements. I am more than capable of understanding the basic problem, of passing on opinion and information, and of adding my voice to the discussion.
I agree with you that there have to be problems of the type you mention regarding changing the API. There are also problems with stable APIs. I don't see how this can ever be anything other than a tradeoff situation.
Also, this seems to be for the most part an "academic" debate, as most Linux users are getting more done and having fewer problems than most Windows users, and the difference seems to become greater with every iteration of each system.
Pierce: Macs run on a "unix" (with a small 'u') ... BSD, a *nix system. So your question needs to be reversed. Also, you don't need to follow Paul's "use it till you break it" method. This cycle of using and breaking applies to all systems. But you certainly have the alternative of actually using it and not breaking it.
It may be academic for you, Greg, but it's an important all-day-every-day issue for thousands of developers such as myself. If the attitude were to become widespread that maintaining old APIs is more of an impediment than a benefit, with Linux vs. Windows held up as the "proof" of that, then not only will developers' lives become more difficult but users will suffer as well. A new feature isn't very useful if you can only get it by upgrading your kernel and everybody's afraid to do that because they've learned that such upgrades will break other stuff. That's the Vista mistake, and Linux shouldn't imitate it. When enough people stop upgrading, stagnation sets in; before long "getting more done and having fewer problems" would no longer be true.
The claim that "stable" APIs are better for proprietary software and "dynamic" APIs (interesting choice of not-antonyms BTW) are better for open-source software doesn't hold water. They're orthogonal issues. The "to break or not to break" debate occurs in both camps, and I know because I've been in both. If there's a pattern to how such decisions are made, it has to do with the size of the user base for old vs. new, not with the development or licensing model. Linux is not to any degree better because its developers are more willing to sacrifice backward compatibility for the sake of attracting new users. That willingness, rather, is an after-the-fact reflection of where Linux stands on the adoption curve. As it continues to climb that curve, the "breakage is good" attitude - see, I can do framing too - can only hurt that for which you advocate.
Some time ago, I wrote this post:
http://www.go2linux.org/why-linux-is-easier-than-windows
I hope you may find it interesting.
Guillermo.
Jeff,
You may have some good points, but I think it is also possible that we are talking cross purposes here. Please have a look at this and I'd love to know what you think.
Not to divert a thread intended for deep geekery, and with thanks for the suggestions already made: can anyone here recommend a URL or two for someone without much spare time and with a beginner's interest in exploring 'nix from a Mac starting point (and PPC hardware)?
... (crickets chirping) ...
Oh well: any cult that would have me as a member isn't exclusive enough.
Pierce: Sorry for the delay. We were busy conspiring and stuff.
You can get earlier versions of Linux to run on a PowerPC. However, to my knowledge this is not an architecture that is necessarily going to be supported in the future. I'm pretty sure Ubuntu stopped upgrading much of their stuff for the PowerPC prior to, or at the beginning of, version 7.
Indeed, my understanding is that Apple has dropped support and development for the PowerPC as well. In other words, the discussion we have been having regarding stable APIs, drivers, and such pales in comparison to Apple dropping an entire category of computer that a lot of us are now, simply, stuck with.
The PowerPC was never primarily a Linux thing, so one could understand Linux dropping it, as it was mainly experimental. Personally, I think it is wrong for Linux developers to drop the PowerPC, for a few reasons, not only because I happen to have one, and I run Linux on it, and I'd prefer to keep it up to date.
Having said all that, Gentoo developers made a sort of promise, or at least a bunch of hooting and hollering, about how they were going to have the best, most, etc. etc. PowerPC distribution ever, for anybody, everywhere, etc. I find Gentoo Linux geeks to be rather full of it sometimes, compared to other Linux geeks (not as bad as Mac geeks, of course), so I'm not sure if I believe this or not ... but you can check it out at their web site:
http://www.gentoo.org/proj/en/base/ppc/
I may be installing Gentoo on our old PowerPC desktop, if in fact it offers a more up-to-date kernel than Ubuntu is giving me now.
Greg -
Ah, thanks much for the info - and good luck with the conspiracy!
FYI: Uh-oh - the page you cite was, according to itself, last updated May 3 '07, and the most recent item under its news heading came from July 1 '06.
Maybe I'll have to remain unindoctrinated until I can justify buying an Intel box...
Translation to Russian: http://blog.r000n.net/2007/12/29/why-is-linux-better/
Pierce: Yes, exactly. Gentoo is the Mac of Linux, culturally. Gentoo can do no wrong, Gentoo is perfect, Gentoo is the best, Gentoo promises to always be good and wonderful. But they seem to have not kept their promise.
I'm annoyed, generally, at Debian as well, for cutting support for continued PowerPC development.
I've got a PowerPC running an earlier version of Ubuntu Linux, and it works fine, but you can't upgrade the "virtual machines" such as Flash, so it is limited. I looked into upgrading it to the latest OS X but found that the latest OS X is a lot like Vista in this regard: the current system has left behind somewhat older hardware. Which leaves a bad taste in my mouth about Apple, frankly.
I may have to go to plan B.