A Blast From the Past: Debian Woody

The Debian project just turned 25, which calls for celebration. Unfortunately, I'm on a pretty restrictive diet at the moment, which means that beer is out of the question. So I settled for the second-best solution: I took Debian 3.0, the first Debian version I used, out for a drive.

Debian Woody was released in July 2002, but I did not get to use it until later, in 2004. At that time, Debian had the reputation of a distribution that was hard to use and newbie-unfriendly. When I first tried Debian, Ubuntu had just had its first release. The reference for a "newbie-friendly" distribution (which I sort of needed; I had been using Linux for only two years or so) was Mandrake Linux. Debian, by contrast, was intimidating and uncompromising. No GUI by default, no fancy installer. Partitioning was done via fdisk. Ouch.

The Woody release came two years after Potato (2.2), and boasted support for several new architectures that now belong in a museum: HP PA-RISC, MIPS, IA-64. It also supported IBM S/390, whose museum-worthiness is debatable. It came with kernel 2.2.20, XFree86 4.1 (Xorg was still two years away at the time), KDE 2.2 and Gnome 1.4. The release notes contain a bunch of other cute memories: scwm was removed because it no longer had a maintainer, nano replaced ae, and people using sendmail/m4 had some special stuff to take care of (hint: look at the anchor name in the link ;-) ).

Out For a Drive

Installation was fun. It looked fun right from the beginning:

Boot screen. Debian did not get a fancy one with a logo until a little later, in Debian 3.1

Debian Woody used an older install script, with a structure that was slightly different from the Debian installer we know today. The install process was pretty similar, but some things (like automated partitioning) were not implemented yet, and a lot of steps that are superfluous now were still included, such as choosing which kernel modules to install. That was relevant because some modules didn't play well together (and udev was still one year away in 2002), and many computers were significantly space-constrained.

Other things you haven't seen in a while include LILO, the LInux LOader:

LILO was a cross between Grub 0.99's efficiency and Grub 2.x's ease of use.

and this dialog asking whether to enable MD5 hashing for passwords:

This is a question that, surprisingly, many large companies that develop web-facing services cannot properly answer today.

Not included here is the next dialog in this sequence, which asked whether you wanted to enable shadow passwords.
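For readers who never saw these dialogs: the MD5 question chose between the traditional DES-based crypt(3) hash, which silently truncates passwords at 8 characters, and the longer `$1$`-prefixed MD5-crypt format; the shadow question moved the hashes out of the world-readable /etc/passwd into the root-only /etc/shadow. A rough sketch of what the MD5 option produced (the salt and password here are made-up examples):

```shell
# Generate an MD5-crypt hash like the ones Woody would store in
# /etc/shadow once both options were enabled. The salt ("abcdefgh")
# and the password ("hunter2") are arbitrary, for illustration only.
openssl passwd -1 -salt abcdefgh hunter2
# The output has the form $1$<salt>$<hash>; the $1$ prefix is what
# marks the entry as MD5-crypt rather than the old DES crypt format.
```

Old DES-crypt hashes, by contrast, were bare 13-character strings with no prefix at all, which is why the two formats could coexist in the same password file.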

The older computer I ran this on was a Pentium III, certainly strong enough to deal with either Gnome 1.4 or KDE 2.2, but that's not what I was using (and not what I was using on my fancy Pentium 4, either). Instead, I used WindowMaker, which I continued to use for a very long time.

Can you believe it's not NeXTStep? Also pictured: sane monitor aspect ratios.

Back then, Firefox was still Mozilla Navigator. Its default theme was not boring because designers had taste back then.

Only partially historically-accurate.

Freshmeat.net was the place to learn about new software. Sadly, I cannot think of any modern-world equivalent that's worth mentioning; I think ESR attempted a reboot of it in the form of freshcode.club, but that didn't really take off.

GitHub was obviously not a thing back then. Many projects were self-hosted; others were on SourceForge, BerliOS or similar hosting platforms. Freshmeat bridged these by allowing developers to list their software in a large directory of open-source software and announce the release of new versions. In time, it grew to incorporate a lot of other things, such as UI themes for many window managers. Sadly, these were lost as the site declined and changed ownership, which is a real shame, as it hosted a lot of computer art.

Freshmeat.net was the website that you wanted to visit if you were an open source enthusiast. It's very unfortunate that it's no longer around. It contributed a great deal to the development of the open source desktop, and its decline and disappearance made many communities around more opinionated open-source projects slowly turn from enthusiastic groups into echo chambers.

This is how I really learned programming:

GNU Emacs was still a little rough around the edges.

When I first installed Linux, I already knew a little programming -- I had played around with various flavours of BASIC, with Pascal, GameMaker and even C++ on Windows, but all I had to show were some silly games and your average CS exercises (do this over a graph, do that over a linked list).

The GNU Emacs vs. Lucid Emacs debate was pretty much a thing of the past by then, but in 2002, XEmacs was still more or less what you wanted. I continued to use it until 2007 or so, I think, when I (temporarily) moved to OS X and took the opportunity to switch to GNU Emacs, too, albeit in the form of Aquamacs. By the time I stopped spending money on overpriced hardware two years later, there was hardly a choice between GNU Emacs and XEmacs anymore.

And, of course, this could not be complete without XBill:

Did you know there's a spin-off of this game called XLennart?

If you were a teenager who was using Linux, hating Bill Gates and Micro$oft (you *had* to spell it Micro$oft) was almost compulsory.

Looking Back

What did we gain since then? What did we lose?

For some uncanny reason, this has to be said right from the beginning: Linux is way, way, way better today, in every way you can possibly imagine. Things not pictured here, because that's just not what a 25th anniversary article should be about, include fiddling with XF86Config, rarely being able to use any affordable printer, frequent crashes from drivers (better than Windows 98, worse than Windows 2000), and features like proper support for an average sound card being so bleeding edge that we regularly compiled our own kernels. And my favourite: a hilarious misfeature in aRts where, if a misbehaving application blocked access to the sound card, aRts continued to buffer sounds, then faithfully played them all back as soon as it regained access, leaving you listening to about ten minutes' worth of Gaim notifications and system error sounds.

A lot of things changed significantly, and it's hard to say whether it was for the better or not. Perhaps the most important was the change in how we distribute software. When I first installed Linux on my home machine, Red Hat 7.2 was the latest and greatest. I was on a dial-up connection that left me no chance to install large updates or download newer versions. When a version shipped, it was as polished as possible.

Today, this is rarely the case in the Western world, where fast connections mean updates can be applied almost instantly. Release early, release often, coupled with the ease of distributing updates, means that large software packages are rarely released in a consistent state. Some modules work great. Others are beta-quality at best. And in the next release, it will be the same story. This leads to much frustration, as things are always unfinished; large software compilations, like KDE and Gnome, seem to be in an eternal beta.

Some contrarians settled for alternatives, seeking stability, reliability and long-term availability in things that have always worked (i.e. text-mode applications), leaving the headaches of keeping a modern Linux desktop up and running to people who like pretty colours. But for many users, it's hard to make an argument for Windows 1.01-style desktops and text-mode applications. That an interface straight out of the 1980s is what passes for a decent alternative to modern Linux desktop environments is, I believe, a reasonable testament to the latter's lack of vision.

We did lose a few things in the process though.

The Linux desktop lost momentum. It's hard to pinpoint exactly how, when and why. After the countless hours I've spent trying to get XFCE4 to look normal in the post-GTK3 days, I'd point fingers at things like KDE 4 and Gnome 3, but that's an overly technical explanation for something that's not so tech-driven.

More likely, what happened is that the desktop in general lost momentum. Back in 2002, if you were a talented kid who wanted to make a name for yourself writing something, you'd go for a window manager, a text editor, a file manager; and a desktop replacement for Windows was a potentially lucrative market, in a world where Windows 95 was barely seven years old. In 2012, smart kids were writing web and mobile apps. This accounts for the lack of diversity and creativity in this space, too. In 2004, the activity in the desktop space was frantic and effervescent: new window managers with new paradigms, new applications, new UX models. In 2018, this field is all but dead, largely confined to the two large desktop environments (Gnome and KDE) and a few large applications like Gimp and Krita.

But regardless of the reason, the Linux desktop is unlikely to mean much anymore. With WSL making inroads on Windows, and if Apple ever gets their heads out of wherever they're keeping them lately, the next Great Rewrite Rush of the Linux desktop will likely be the last one.

Free licensing became a lot less meaningful. There are a lot of devices powered by GPL-licensed code which you cannot change at all. Many products, like the countless $company Embedded Linux distributions (mostly Yocto- or Buildroot-based derivatives), are little more than (cheap) free software with a magic touch of vendor lockdown.

Many FOSS communities became less welcoming. The kids that were nurtured through posturing and endless flame wars in mailing lists and forums are all grown up and write free software today. The stakes became a lot higher, too, as there is now more money in open source work, and this inherently attracts more ego. In many places, modesty is frowned upon, and the code that ends up being written is exactly what you would expect from people who never doubt that they're right. Many projects are largely corporate-backed today, which often makes contributions from independent developers difficult, and makes "community" decisions and efforts less meaningful.

What About Debian?

Debian is alive and well, though. Its social contract still makes it an appropriate choice for many independent-minded users and developers, and its stability makes it a useful and well-balanced choice for workstation desktops, servers, and embedded devices.

Debian is certainly better than ever, too. And with any luck, I'll be writing another one of these on its 50th anniversary.

Happy 25th anniversary, Debian!