APC: How did you get started developing for the Linux kernel, did you enjoy it, and what's the passion that drives you?
Normally this would be a straightforward question to answer. However, at this time I've had the opportunity to reflect on what really got me into kernel development, so I'll be able to answer it more thoroughly. If you give me the latitude to answer it fully, I might end up answering all your potential questions as well. This is going to be my view on personal computing history.
I owned my very first computer 24 years ago. I've been fortunate to have been involved in the personal computer scene since not long after it actually began. In the time since then I've watched the whole development of the PC: first with excitement, not knowing what direction it would take, then with anticipation, knowing where it would head and waiting for the developments.
The late 1980s were a golden era for computing. So many different manufacturers were entering the PC market, each offering new and exciting hardware designs, unique operating system features, and enormous choice and competition. Sure, they all shared lots of software and hardware design ideas, but for the most part they were developing in competition with each other. It is almost frightening to recall that, at that time in Australia, the leading personal computer in numbers owned and purchased was, for a period, the Amiga.
|Amiga 500: cheap, cheerful, unique and immensely successful.|
Anyone who lived through the era of the first Amiga personal computers will recall how utterly unique an approach they took to computing, and how far they advanced the home computer. Since then there have been many failed attempts at resuscitating that excitement. But this is not about the Amiga, because it ultimately ended up being a failure for other reasons. My point about the Amiga is that radical hardware designs drove development and achieved things that software evolution on existing designs would not have taken us to.
At that time the IBM personal computer and its compatibles were still clunky, expensive, glorified word-processing DOS machines. Their owners were always putting in different graphics and sound cards, upgrading their hardware yearly to try to approach what was built into hardware like the Amiga and the Atari PCs.
Enter the dark era. Hardware-driven computer development failed due to poor marketing, poor development and a whole host of other problems. This is when software became king: instead of competing, all hardware was slowly being designed to yield to the software and operating system design.
We're all aware of what became the de facto operating system standard at the time. As a result, there was no market whatsoever for hardware that didn't work within the framework of that operating system. As the de facto operating system took over, all other operating system markets and competitors failed one after the other, and the hardware manufacturers found themselves marketing for an ever-shrinking range of software rather than the other way around.
Hardware has since become subservient to the operating system. It started around 1994 and is just as true today, 13 years later. Worse yet, the hardware manufacturers slowly bought each other out, further shrinking the hardware choices. So now the hardware manufacturers just make faster and bigger versions of everything that has been done before. We're still plugging in faster CPUs, more RAM, bigger hard drives, faster graphics cards and sound cards, just to service the operating system. The market can no longer afford hardware-driven innovation. There is no money in it. There will be no market for it. Computers are boring.
Enter Linux. We all know it started as a hobby. We all know it grew bigger than anyone ever imagined. It would be fair to say that it is now the most important of the very few remaining competitors to, and drivers of, the de facto standard operating system, Windows. However, I believe it never deserved to become this. Had innovative hardware-driven development and operating system competition continued, there is no way it would have attracted as much following, as many developers, or the time to evolve into what it has become. The hardware has barely changed in all that time. PCs are ludicrously powerful compared to what they were when Linux first booted in 1991, but that's a matter of increased speed, not increased functionality or innovation.
So what about the PC? Well, the PC was 'dying' according to all accounts 20 years ago. We all know now that that is crap, and, for the foreseeable future at least, the one all-encompassing information-processing, communication (and frustration-creating) device is here to stay. The internet has certainly cemented that position for the PC.
So Linux was created to service the home desktop personal computer, and the PC is here to stay. For those who were looking for some excitement and enjoyment in using their computer, the de facto operating system just doesn't cut it. We want to tinker, we want control, we want power over everything. Or, alternatively, we believe in some sort of freedom, or some combination of the above. So we use Linux. That is certainly how I got involved in Linux: I wanted something to use on the home desktop PC.
However, the desktop PC experience is crap. It's rubbish. It is so bloated and slowed down in all the things that matter to us. We all own computers today that were considered supercomputers 10 years ago; 10 years ago we owned the supercomputers of 20 years ago, and so on. So why on earth is everything so slow? If they're exponentially faster, why does it take longer than ever for our computers to start, for our applications to start, and so on? Sure, when they get down to pure number-crunching they're amazing (just encode a video and be amazed). But in everything else they must be unbelievably slower than ever.
Computers of today may be 1000 times faster than they were a decade ago, yet the things that matter are slower.
The standard argument people give me in response is 'but they do so much more these days, it isn't a fair comparison'. Well, if they're 10 times slower despite being 1000 times faster, then they must be doing 10,000 times as many things. Clearly, the 10,000 times more things they're doing are all in the wrong place.
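The back-of-envelope arithmetic behind that claim can be sketched as follows. Note that the 10x and 1000x figures are the illustrative numbers used in the argument above, not measurements:

```python
# Back-of-envelope arithmetic from the argument above.
# Both figures are illustrative, not measured.
hardware_speedup = 1000    # raw hardware speed vs. a decade ago
perceived_slowdown = 10    # how much slower the tasks that matter feel

# If a machine 1000x faster feels 10x slower at the same task, the
# implied growth in work done per user-visible task is the product:
implied_work_ratio = hardware_speedup * perceived_slowdown
print(implied_work_ratio)  # → 10000
```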
|Latest OS slow as molasses: long-time Microsoft-watcher Mary Jo Foley is scathing about the 'molasses-like' boot and shutdown speed of Vista, even running on the latest high-end machines.|
APC: So the performance problems of Linux on the desktop became a key motivator for you?
Yes. I started to tinker with improving Linux on the desktop. "Surely, if I have complete control over all the software, I'll be able to speed things up," I thought. There must be a way to tweak this, tune that, optimise something and get more speed-ups? Userspace improvements seemed so limited when I got started in Linux: it barely worked on the desktop half the time, so trying to get speed out of it on top of that would mean that nothing worked. The UNIX legacy was evident. We were shaping an operating system never designed for the desktop, and it was going to hurt… a lot.
Eventually, the only places I noticed any improvements in speed were kernel developments. They were never huge, but they caused slightly noticeable changes in things like snappiness and behaviour under CPU load. The first patchset I released to the public contained none of my own code and was for kernel 2.4.18, which was around February 2002. I didn't even know what C code looked like back then, never having been formally taught any computer science.
So I stuck with that for a while, until the 2.6 development process was under way (we were still on a 2.5 kernel at the time). I watched the development and, to be honest… I was horrified. All the kernel hackers whose names I had come to respect and observe were frantically working away on this new and improved kernel, and pretty much everyone was working on all this enterprise crap that a desktop cares nothing about.
Even worse than that, while I obviously like to see Linux run on 1024 CPUs and 1000 hard drives, I loathe the fact that to implement that we have to kill performance on the desktop. What's that? Kill performance? Yes, that's what I mean.
If we quantify it numerically, with all the known measurable quantities, performance is better than ever. Yet all it took was to start up an audio application and wonder why on earth, if you so much as breathed on the machine, the audio would skip. Skip! Jigabazillion bagigamaherz of CPU and we couldn't play audio?
Or click on a window and drag it across the screen, and it would spit and stutter in fits and starts. Or write one large file to disk and find that the mouse cursor would barely move, and everything else on the desktop would be dead, without refreshing, for a minute.
I felt like crying.
I even recall one bug report we tried to submit about this, where one developer said he couldn't reproduce the problem on his quad-CPU machine with 4GB of RAM and four striped RAID array disks… think about the sort of hardware the average user would have had four years ago. Is it any wonder the desktop sucked so much?
The developers were all developing for something that wasn't the desktop. They had all been employed by big-name manufacturers who couldn't care less about the desktop (and still don't), but who want that last 1% on their database benchmark or throughput benchmark or whatever.
Linux had won. We were now the biggest competition in the server and database market out there, and all the big names cared about Linux. Money was pouring in from all these big names to develop Linux's performance in those areas.
The users had lost. The desktop PC, for which Linux development had started out, had fallen by the wayside. Performance, as home desktop users understand performance, was gone. Worse yet, there was no way to quantify it, and the developers didn't care if we couldn't prove it. The one place I had found performance to be gained on the desktop, the kernel, was now doing the opposite.
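The "no way to quantify it" problem is worth dwelling on. Audio skips and stuttering windows are, at heart, wakeup-latency problems: a task asks to run at a certain moment and the scheduler delivers it late. Purely as an illustration of how one might put a rough number on that (this sketch is my own assumption about how to measure it, not a tool from the interview), one can time how late short sleeps actually wake up:

```python
# Illustrative sketch: measure how late short sleeps wake up.
# Under heavy load, the worst-case lateness here is roughly the kind
# of delay that makes audio skip and windows stutter.
import time

def wakeup_jitter(samples=200, interval=0.01):
    """Sleep `interval` seconds `samples` times; return (average, worst)
    wakeup lateness in seconds beyond the requested interval."""
    worst = 0.0
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(interval)
        late = (time.perf_counter() - start) - interval
        worst = max(worst, late)
        total += late
    return total / samples, worst

avg, worst = wakeup_jitter()
print(f"average lateness: {avg * 1000:.2f} ms, worst: {worst * 1000:.2f} ms")
```

Run it on an idle machine and again while writing a large file or compiling something; the gap between the two worst-case numbers is a crude measure of exactly the interactivity the benchmarks of the day never captured.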