Table of Contents
- 1. A Brief History of Linux
- 2. Linux Information Sources
- 3. Advanced Linux "Research Areas"
- 4. My Linux Projects
- 5. Linux CPU Architecture
- 6. Linux Networking Links
- 7. Fannish/Advocacy Linux Stuff
- 8. Linux Entertainments
- 9. Linux and Christian Stuff
This document collects links relating to the development and use of the Linux Operating System.

Linux is a UNIX-like operating system that is largely (and increasingly) POSIX-compliant (1). It was first written by Linus Torvalds (see The Rampantly Unofficial Linus Torvalds FAQ), who started in 1991 with the idea of learning the capabilities of an 80386 processor for task switching. Originally named Freax, it was initially hosted on the Minix operating system. Linus' complaints about Minix were that it was written:
- As an academic/educational OS
- To be portable (8088, 68000, Sparc) using only the lowest common denominator features.
A pretty good flamewar over this ensued on Usenet involving such luminaries as Linus Torvalds, Andrew Tanenbaum, Ted Ts'o, David Miller, and others; see Linux is Obsolete.
Both of these design assumptions seemed crippling, sacrificing performance for academic concerns. The assumptions for Linux have changed slightly since then; portability is now a goal, and it is certainly not simply an "academic" requirement for software. It notably moved up the priority list when Digital Equipment Corporation (now a Compaq company) gave Linus an Alpha-based system; today there are commercial enterprises actively selling and supporting products based on the various ports to IA-32, PowerPC, MIPS, Alpha, and ARM architectures.
Since that time, thousands of people have contributed patches, fixes, and improvements to the kernel, improving performance and adding support for many devices and file systems, as well as for an increasing variety of platforms. Linus presently works for the formerly rather secretive Transmeta Corporation. Linus' fame has regrettably resulted in something of a "personality cult" surrounding him, with kids I can only describe as "worshippers."
    "It's obvious that the 'Linus personality cult' has got to go."
        -- Linus Torvalds, May 5, 1999, ABC News
The "cult" began fairly early when
an FTP site administrator hated the name Freax and insisted on using the more
"egotistcal" name Linux. Of
course, the "worshippers" have gotten younger and more
clueless over the years...
The original and still primary supported platform is IA-32 (aka "Wintel" PCs), but there are now ports to many other CPUs. The IA-32, Alpha, and SPARC versions can reasonably be regarded as "production versions;" MkLinux on PPC should soon be in that form. For more architectural details see Linux Architecture.
Linux is being developed under the GPL, and thus is freely redistributable in source code form. It is particularly notable as an operating system truly developed over the Internet. Many of those who have participated in its development are in different countries, and most have never physically met. This "peer review" at a source code level on an exceedingly wide scale has led to it becoming a remarkably featureful and stable system.
Eric Raymond has written a widely read essay on the Linux development process entitled The Cathedral and the Bazaar. He describes the way Linux kernel development uses a "Bazaar" approach, in which code is released as quickly and as often as possible; this permits extensive outside input, which in turn provides rapid improvements.
The "Bazaar" approach can be contrasted with the "Cathedral" approach used by systems like the GNU Emacs "core." The "Cathedral" approach may be characterized as resulting in more "beautiful" code, but code that is released far less often. Unfortunately, it also offers less opportunity for people outside the "core group" to contribute to the process.
Successful Bazaar projects do not necessarily involve throwing the code "wide open" for everyone to play with; at the design level, the "Cathedral" approach is fairly appropriate. But once the code is being debugged, it is valuable to "open up the bazaar" so that a large number of people can find different sorts of errors. If they can contribute fixes, that is most wonderful; but even if they merely locate problem areas in the code, that is still helpful to the coders.
I have written a
further essay entitled Linux and
Decentralized Development, which describes the way that
Linux efforts are "distributed,"
and some of the merits and demerits thereof.
There is an experimental set of OS versions (currently the 2.5.x series) in which version numbers move steadily upwards on a weekly basis; the "stable" edition (e.g. 2.2.10) changes only as a result of bugs being found and fixed in the "experimental" series, and hence does not change nearly as often.
Linux users
acknowledge system bugs, and quickly work to resolve them.
As a result, security
problems tend to be very quickly solved. There obviously is
no guarantee that all users will immediately fix
their systems if they are not being affected (or don't
notice that they are affected) by problems, but certainly
fixes are quickly available, sometimes distributed across the Internet within hours of a problem being detected. Fixes are often available before commercial vendors such as Sun, HP, and IBM have even gotten around to acknowledging that there's a problem.
This is in stark contrast to the behaviour of some companies; see, e.g., Bill Gates Claims "Microsoft Code has No Bugs". At least, there are none that Microsoft cares to fix. Statistically, Microsoft apparently determined that the vast majority of "bugs" represent situations where users are not using the software correctly. The remaining few problems, caused by actual errors, are statistically insignificant compared to the number of other kinds of problems, and thus can safely be ignored as "not real problems."
Work is ongoing on creating highly stable Linux systems; properly configured Linux kernels, with suitably configured software on top, commonly run for hundreds of days without any need to reboot. Ensuring high availability also requires careful choices of hardware, discussed at the Linux-HA Project Web Site.
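As a small illustration (a sketch only, not part of any particular HA toolkit), the kernel reports the time since boot as the first field of /proc/uptime, so a trivial C program can show how many days a machine has run without a reboot:

    /* Sketch: report days since boot by reading /proc/uptime
     * (first field = seconds of uptime). Illustrative only. */
    #include <stdio.h>

    int main(void)
    {
        FILE *fp = fopen("/proc/uptime", "r");
        double up_seconds = 0.0;

        if (fp == NULL) {
            perror("fopen /proc/uptime");
            return 1;
        }
        if (fscanf(fp, "%lf", &up_seconds) != 1) {
            fprintf(stderr, "unexpected format in /proc/uptime\n");
            fclose(fp);
            return 1;
        }
        fclose(fp);

        printf("System has been up for %.1f days\n", up_seconds / 86400.0);
        return 0;
    }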
There are complaints
that "Linux is always changing."
The complaint misses the whole point of its existence.
Development on the Linux kernel will stop when people are
no longer interested in "building and
enhancing" the Linux kernel. So long as new hardware
devices such as video cards are being created and people
come up with other neat improvements to make to Linux,
development work will continue.
The project may
eventually end as a result of there being some
fundamentally better platform for "kernel hacking," or because Linux becomes
so
heavily patched that it becomes unmaintainable. I'm not
sure that we're at that point yet. Moreover, there are plans
in motion to make Linux even more modular than it is now,
with various schemes for moving services out of the base
kernel and into "user
space."
The announcement of a
Debian/Hurd
effort suggests an alternative; the FSF's Hurd kernel,
which runs as a set of processes atop a microkernel such as
Mach, may provide a more readily "hackable" system for those people who aren't quite up to fiddling with changes to the Linux kernel.
Mach provides a
"message passing" abstraction
that allows the OS to be constructed as a set of
"components" in somewhat the
style promoted by CORBA.
By running atop a
microkernel, Hurd should be more readily debugged than the
Linux kernel since system components may be stopped and
restarted without the need to reboot the system.
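To give a flavour of the idea (and only that; this is not Mach's actual API, which uses ports and mach_msg()), here is a minimal C sketch in which two "components" run as separate processes and exchange a request and a reply as messages over a socketpair:

    /* Toy illustration of the message-passing style: a "client"
     * component sends a request to a "server" component running as a
     * separate process, and gets a reply back. Not Mach; just POSIX. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <sys/wait.h>

    int main(void)
    {
        int sv[2];
        if (socketpair(AF_UNIX, SOCK_STREAM, 0, sv) < 0) {
            perror("socketpair");
            return 1;
        }

        if (fork() == 0) {              /* "server" component */
            char req[64] = {0};
            read(sv[1], req, sizeof(req) - 1);
            const char *reply = strcmp(req, "ping") == 0 ? "pong" : "?";
            write(sv[1], reply, strlen(reply));
            _exit(0);
        }

        /* "client" component: send a request, then wait for the reply */
        char reply[64] = {0};
        write(sv[0], "ping", 4);
        read(sv[0], reply, sizeof(reply) - 1);
        printf("client received: %s\n", reply);
        wait(NULL);
        return 0;
    }

The point of the style is that each service lives in its own protection domain and can, in principle, be stopped and restarted independently, which is what makes a microkernel-hosted system attractive for debugging.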