Linux is an operating system: a software program that controls your computer. Most vendors load an operating system onto the hard drive of a PC before delivering it, so, unless the hard drive of your PC has failed, you may never have had to install an operating system yourself and may not fully appreciate what one does.
An operating system solves several problems arising from hardware variation. As you’re aware, no two PC models (or models of other computers, for that matter) have identical hardware. For example, some PCs have an IDE hard drive, whereas others have a SCSI hard drive. Some PCs have one hard drive, others have two or more. Most PCs have a CD-ROM drive, but some do not. Some PCs have an Intel Pentium CPU, whereas others have an AMD K-6, and so on. Suppose that, in a world without operating systems, you’re programming a new PC application, perhaps a new multimedia word processor. Your application must cope with all the possible variations of PC hardware. As a result, it becomes bulky and complex. Users don’t like it because it consumes too much hard drive space, takes a long time to load, and—because of its size and complexity—has more bugs than it should.
Operating systems solve this problem by providing a single standard way for applications to access hardware devices. When an operating system exists, applications can be more compact, because they share the commonly used code for accessing the hardware. Applications can also be more reliable because this code is written only once, and by expert programmers, rather than by application programmers.
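To make this concrete, here is a small illustration (a sketch only; the file and directory names are invented). Under Linux, a user or an application reads and writes files the same way regardless of what kind of drive holds them, because the kernel's device drivers absorb the hardware differences:

```
# The same command works whether the destination lives on an IDE hard
# drive, a SCSI hard drive, or a floppy disk; neither the user nor the
# cp program needs to know which, because the kernel handles the device.
$ cp report.txt /home/alice/            # a directory on the hard drive
$ cp report.txt /mnt/floppy/            # a mounted floppy disk
```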
As you’ll soon learn, operating systems do many other things as well; for example, they generally provide a filesystem so that you can store and retrieve data, and a user interface so that you can control the operation of your computer. However, if you think of a computer’s operating system as its subconscious mind, you won’t be far off the mark. It’s the computer’s conscious mind—applications such as word processors and spreadsheets—that do useful work. But, without the subconscious—the operating system—the computer would cease breathing and applications would not function.
Now that you know what an operating system is, you may be wondering what operating system your PC uses. Chances are, your PC operating system was provided by Microsoft. Table 1-1 shows the sales of several popular desktop operating systems during 1997 and projected sales for 2001.[1] Bear in mind that, because Linux is a free operating system, Linux sales are a mere fraction of Linux installations. Moreover, unlike most commercial operating systems, Linux is not sold under terms of a per-seat license; a company is free to purchase a single Linux CD-ROM and install Linux on as many computer systems as it likes.
Table 1-1. Sales of Popular Desktop Operating Systems
| Operating System | 1997[a] | 2001 (est.) |
|---|---|---|
| Windows 95/98 | 69.4% | 65.0% |
| Windows NT Workstation | 9.2 | 26.2 |
| DOS with Windows 3.x[b] | 7.7 | 0.3 |
| MacOS | 4.6 | 1.9 |
| Linux | 2.4 | 4.2 |
| DOS without Windows[b] | 2.3 | 0.3 |
| Unix | 1.0 | 0.5 |
| OS/2 Warp | 0.8 | 1.2 |
| Other | 2.7 | 0.5 |

[a] U.S. sales of desktop operating systems as percent of market.
[b] Includes IBM, Digital Research (DR), and Microsoft versions of DOS.
As the table shows, your desktop computer is probably running Microsoft Windows 95 or Windows 98, which together accounted for over 69% of 1997 sales. The sales of Linux were minuscule in comparison: a mere 2.4%. As explained, these figures don’t do full justice to the ubiquity of Linux. Nevertheless, notice that sales of Linux are expected to almost double, whereas those of Windows 95/98 are expected to contract slightly.
Later in this chapter you’ll learn how Linux is distributed, but recall that Linux was termed a free operating system. If you have a high-speed Internet connection, you can download, install, and use Linux without paying anyone for anything (except perhaps your Internet Service Provider, who may impose a connection fee). It’s anyone’s guess how many people have downloaded Linux, but estimates indicate that between 7 and 10 million computers run Linux.
Moreover, many Linux users run Linux not as a desktop system but as a server, which is powered up and online 24 hours per day, connected (at least occasionally) to the Internet, and ready to provide services to requesting clients. For example, many Linux users run web servers, hosting web sites browsed by users worldwide. But the number of desktop Linux users—those who power on their computer to use it and power it off when they’re done—is rising.
Desktop use of Linux is the focus of this book. However, if you’re unfamiliar with Linux and Unix, this book is right for you even if you plan to establish a Linux server. This book will take you through the basics of setting up and using Linux. After you’ve mastered what this book offers, you should consult Running Linux, Third Edition, by Matt Welsh, Matthias Kalle Dalheimer, and Lar Kaufman (O’Reilly, 1999), a more advanced book that focuses on setting up and using Linux servers.
Linux is distinguished from many popular operating systems in three important ways.
Linux is a cross-platform operating system that runs on many computer models. Only Unix, an ancestor of Linux, rivals Linux in this respect. In comparison, Windows 95 and Windows 98 run only on CPUs having the Intel architecture. Windows NT runs only on CPUs having the Intel architecture or the DEC Alpha.
Linux is free, in two senses. First, you may pay nothing to obtain and use Linux. On the other hand, you may choose to purchase Linux from a vendor who bundles Linux with special documentation or applications, or who provides technical support. However, even in this case, the cost of Linux is likely to be a fraction of what you’d pay for another operating system. So, Linux is free or nearly free in an economic sense.
Second, and more important, Linux and many Linux applications are distributed in source form. This makes it possible for you and others to modify or improve them. You’re not free to do this with most operating systems, which are distributed in binary form. For example, you can’t make changes to Microsoft Windows or Microsoft Word—only Microsoft can do that. Because of this freedom, Linux is being constantly improved and updated, far outpacing the rate of progress of any other operating system. For example, Linux will likely be the first operating system to support Intel’s forthcoming Merced 64-bit CPU.
Linux has attractive features and performance. Free access to Linux source code lets programmers around the world implement new features, and tweak Linux to improve its performance and reliability. The best of these features and tweaks are incorporated in the standard Linux kernel or made available as kernel patches or applications. Not even Microsoft can mobilize and support a software development team as large and dedicated as the volunteer Linux software development team, which numbers in the hundreds of thousands, including programmers, code reviewers, and testers.
Linux traces its ancestry back to a mainframe operating system known as Multics (Multiplexed Information and Computing Service). Begun in 1965, Multics was one of the first multi-user computer systems and remains in use today. Bell Telephone Labs participated in the development of Multics, along with the Massachusetts Institute of Technology and General Electric.
Two Bell Labs software engineers, Ken Thompson and Dennis Ritchie, worked on Multics until Bell Labs withdrew from the project in 1969. One of their favorite pastimes during the project had been playing a multi-user game called Space Travel. Now, without access to a Multics computer, they found themselves unable to indulge their fantasies of flying around the galaxy. Resolved to remedy this, they decided to port the Space Travel game to run on an otherwise unused PDP-7 computer. Eventually, they implemented a rudimentary operating system they named Unics, as a pun on Multics. Somehow, the spelling of the name became Unix.
Their operating system was novel in several respects, most notably portability. Most previous operating systems had been written for a specific target computer. Just as a tailor-made suit fits only its owner, such an operating system could not be easily adapted to run on an unfamiliar computer. In order to create a portable operating system, Ritchie and Thompson first created a programming language, called C. Like assembly language, C let a programmer access low-level hardware facilities not available to programmers writing in a high-level language such as FORTRAN or COBOL. But, like FORTRAN and COBOL, a C program was not bound to a particular computer. Just as a ready-made suit can be lengthened or shortened here and there to fit a purchaser, writing Unix in C made it possible to easily adapt Unix to run on computers other than the PDP-7.
As word of their work spread and interest grew, Ritchie and Thompson made copies of Unix freely available to programmers around the world. These programmers revised and improved Unix, sending word of their changes back to Ritchie and Thompson, who incorporated the best such changes in their version of Unix. Eventually, several Unix variants arose. Prominent among these was BSD (Berkeley Software Distribution) Unix, written at the University of California, Berkeley, in 1978. Bill Joy, one of the principals of the BSD project, later became a founder of Sun Microsystems, which sold another Unix variant (SunOS) to power its workstations. In 1984, AT&T, the parent company of Bell Labs, began selling its own version of Unix, known as System V.
What Ritchie and Thompson had begun in a distinctly non-commercial fashion ended up spawning several legal squabbles. When AT&T grasped the commercial potential of Unix, it claimed Unix as its intellectual property and began charging a hefty license fee to those who wanted to use its Unix. Soon, others who had implemented Unix-like operating systems were distributing licenses only for a fee. Understandably, those who had contributed improvements to Unix considered it unfair for AT&T and others to appropriate the fruits of their labors. This concern for profit was unlike the democratic, share-and-share-alike spirit of the early days of Unix.
Some, including MIT scientist Richard Stallman, yearned for the return of those happier times and the mutual cooperation of programmers that then existed. So, in 1983, Stallman launched the GNU (GNU’s not Unix) project, which aimed at creating a free Unix-like operating system. Like early Unix, the GNU operating system was to be distributed in source form so that programmers could read, modify, and redistribute it without restriction. Stallman’s work at MIT had taught him that, by using the Internet as a means of communication, programmers the world over could improve and adapt software at incredible speed, far outpacing the fastest rate possible using traditional software development models, in which few programmers actually see one another’s source code.
As a means of organizing work on the GNU project, Stallman and others created the Free Software Foundation (FSF), a non-profit corporation that seeks to promote free software and eliminate restrictions on the copying, redistribution, understanding, and modification of software. Among other activities, the FSF accepts tax-deductible charitable contributions and distributes copies of software and documentation for a small fee, using this revenue to fund its operations and support the GNU project.
If you find it peculiar that the FSF charges a fee—even a small fee—for “free” software, you should understand that the FSF intends the word free to refer primarily to freedom, not price. The FSF believes in three fundamental software freedoms:
You can copy GNU software and give it away to anyone you choose.
If you’re a programmer, you can modify GNU software any way you like, because you have access to the source code.
You can distribute improved versions of GNU software. However, you cannot charge anyone a fee for using your improved version (although you can charge a fee for providing a user with a physical copy of your software).
Commercial software vendors protect their proprietary rights to software by copyrighting the software. In contrast, the FSF protects software freedom by copylefting its software. If the FSF placed its software in the public domain, others would be free to transform it into a proprietary product, denying users the freedom intended by the original author of the software. For example, a company might distribute the software in binary rather than source form and require payment of a license fee for the privilege of making additional copies.
To copyleft software, the FSF uses the same legal instrument used by proprietary software vendors—the copyright—but the FSF adds special terms that guarantee freedom to users of the software. These terms, referred to as the GNU General Public License (GPL), give everyone the right to use, modify, and redistribute the software (or any software derived from it), but only if the distribution terms are unchanged. Thus someone who attempts to transform FSF software into a proprietary product has no right to use, modify, or distribute the product. As the FSF puts it, “Proprietary software developers use copyright to take away the users’ freedom; we use copyright to guarantee their freedom. That’s why we reverse the name, changing copyright into copyleft.”
By the early 1990s, the FSF had obtained or written all the major components of the GNU operating system except for one: the kernel. About that time, Linus Torvalds, a Finnish computer science student, began work on a kernel for a Unix-like system. Linus had been working with Minix, a Unix-like operating system written by Andrew Tanenbaum primarily for pedagogical use. Linus was disappointed by the performance of the Minix kernel and believed that he could do better. He shared his preliminary work with others on Internet newsgroups. Soon, programmers around the world were working together to extend and improve his kernel, which he called Linux (for Linus’s Minix). As Table 1-2 shows, Linux grew rapidly. Within three years of its October 5, 1991 initial release, Linux was released as production software; version 1.0 was released in March of 1994. However, as early as 1992, Linux had been integrated with other GNU software to produce a fully functional operating system, which took as its name the name of its kernel.
Table 1-2. The History of Linux
| Year | Version | Users | Kernel size (bytes) | Milestone(s) |
|---|---|---|---|---|
| 1991 | 0.01 | 100 | 63,362 | Linus Torvalds writes Linux kernel |
| 1992 | 0.99 | 1000 | 431,591 | GNU software integrated with Linux kernel, producing a fully functional operating system |
| 1993 | 0.99 | 20,000 | 937,917 | High rate of code contributions prompts Linus to delegate code review responsibility |
| 1994 | 1.0 | 100,000 | 1,016,601 | First production release |
| 1995 | 1.2 | 500,000 | 1,850,182 | Linux adapted to non-Intel processors |
| 1996 | 2.0 | 1,500,000 | 4,718,270 | Linux supports multiple processors, IP masquerading, and Java |
| 1999 | 2.2 | 7,500,000 | 10,600,000[a] | Linux growth rate exceeds that of Microsoft Windows NT |

[a] Estimated.
However, work on Linux did not cease. Since the initial production release, the pace of development has accelerated as Linux has been adapted to include support for non-Intel processors and even multiple processors, sophisticated TCP/IP networking facilities such as IP masquerading, and more. Versions of Linux are now available for such computer models as the Apple PowerPC, the DEC Alpha, the Motorola 68k, the Sun SPARC, the MIPS, and many others. Moreover, Linux does not implement an obscure Unix variant: it generally complies with the POSIX (Portable Operating System Interface) standard that forms the basis of the X/Open specifications of The Open Group.
Another important component of Linux is its graphical user interface, the X Window System. Unix was originally a mouseless, text-based system that used noisy teletype machines rather than modern CRT monitors. The Unix command interface is very sophisticated and, even today, some power users prefer it to a point-and-click graphical environment, using their CRT monitor as though it were a noiseless teletype. Consequently, some remain unaware that Unix long ago outgrew its text-based childhood, and now provides users a choice of graphical or command interfaces.
The X Window System (or simply X) was developed as part of the Massachusetts Institute of Technology’s (MIT) Project Athena, which began in 1984. By 1988, MIT released X to the public. MIT has since turned development of X over to the X Consortium, which released version 6 in September 1995.
X is a unique graphical user interface in two major respects. First, X integrates with a computer network, letting users access local and remote applications. For example, X lets you open a window that represents an application running on a remote server: the remote server does the heavy-duty computing; all your computer need do is pass the server your input and display the server’s output.
Second, X lets you configure its look and feel to an amazing degree. To do so, you run a special application—called a window manager—on top of X. A variety of window managers is available, including some that closely mimic the look and feel of Microsoft Windows.
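To give a feel for both points, here is a rough sketch of an X session. The host names, and the choice of the fvwm2 window manager, are examples only, and the exact commands vary from system to system:

```
# On your local machine: allow the remote host to draw on your display.
$ xhost +bigserver

# Log in to the remote machine, point X clients at your local display,
# and start an xterm; it runs on bigserver but appears in a window here.
$ rlogin bigserver
$ export DISPLAY=mypc:0
$ xterm &

# Back on your own machine: choose a window manager by naming it in
# ~/.xinitrc, which startx reads when it brings up X.
$ echo "exec fvwm2" > ~/.xinitrc    # overwrites any existing .xinitrc
$ startx
```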
Because Linux can be freely redistributed, you can obtain it in a variety of ways. Various individuals and organizations package Linux, often combining it with free or proprietary applications. Such a package that includes all the software you need to install and run Linux is called a Linux distribution. Table 1-3 shows some of the most popular Linux distributions.
Table 1-3. Popular Linux Distributions and Their Web Home Pages
| Distribution | Home Page |
|---|---|
| Caldera OpenLinux | |
| Debian Linux | |
| Slackware Linux | |
| Red Hat Linux | |
| SuSE Linux | |
Caldera, Red Hat, Slackware, and SuSE are packaged by commercial companies, which seek to profit by selling Linux-related products and services. However, because Linux is distributed under the GNU GPL, you can download these distributions from the respective companies’ web sites or make additional copies of a Linux distribution you purchase from them. (Note, however, that you cannot necessarily make additional copies of proprietary software that these companies may distribute with their Linux distribution.) Debian Linux is the product of volunteer effort conducted under the auspices of Software In The Public Interest, Inc., a non-profit corporation. This book is bundled with a copy of Linux, which you can install and run on your PC.
The origins of Linux and the availability of its source code set it apart from other operating systems. But most users choose an operating system based on features and performance—and Linux delivers these in spades. Table 1-4 compares certain features and performance characteristics of a specific Linux distribution—Red Hat Linux 5.1—with those of Microsoft Windows NT 4.0 and Sun Microsystems Solaris 2.6.[2] Each of these three operating systems can be run on an Intel-architecture PC.
Table 1-4. Linux Features and Performance Comparison
| Characteristic | Red Hat Linux | Windows NT | Solaris |
|---|---|---|---|
| Range of compatible hardware | Very wide | Modest | Narrow |
| Minimal hardware | 386 PC | 486 PC | Pentium |
| Representative cost of hardware | $200 | $1300 | $1600 |
| Average downtime | Very low | As low as 30 min./week | Very low |
| Performance | High | Comparable to Linux | Half of Linux to same as Linux |
| Multi-processing capabilities | Excellent | Modest | Excellent |
| IP Security (IPSec) | Yes | Planned | 1999 |
| IPv6 | Available | Privately demonstrated | Beta |
| Overall user satisfaction, per Datapro | Highest | Lowest | Medium |
| Source code readily available | Yes | No | No |
| Installed base | Millions | Millions | Hundreds of thousands |
As you can see, Linux fares well in this comparison. It runs on a wider range of hardware platforms and runs adequately on less costly and less powerful systems. Moreover, the typical downtime of a Linux system is less than that of a Windows NT system and its performance surpasses that of a Solaris system. Its multi-processing capabilities exceed those of Windows NT and its support of advanced TCP/IP networking facilities is superior to that of Windows NT and Solaris. As a group, Linux users are more satisfied than Windows NT users and Solaris users. Linux source code is readily available. And, the Linux installed base dwarfs that of Solaris and approaches that of Windows NT.
But this impressive inventory of selling points doesn’t end the matter. Let’s consider some other technical characteristics of Linux that distinguish it from the pack. Foremost in the minds of many is the low cost of Linux. Comparable server operating systems can cost more than $100,000. The low cost of Linux makes it practical for use even as a desktop operating system. In that mode, it truly eclipses the competition.
Many desktop systems are occasionally, even regularly, employed as servers. Because Linux was designed for use as a server operating system, its features and performance readily outshine those of desktop operating systems used as makeshift servers. For example, Microsoft’s software license for Windows NT Workstation restricts the number of simultaneous client connections to 10; if your Windows NT Workstation computer accepts more than 10 client connections, it is operating in breach of license. However, Linux imposes no such restriction: your Linux desktop is free to accept as many client connections as you think it can handle.
Again, because it was designed as a server, Linux provides more reliable data storage than competing desktop operating systems. Most Linux users store their disk data using the EXT2 filesystem, which is superior in performance and reliability to filesystems (partition types) provided by Microsoft operating systems, including FAT, FAT32, and NTFS. Of course, Microsoft claims that its NTFS filesystem is so reliable that you’ll probably never need to use special software tools to recover lost data—truth is, Microsoft provides no such tools. Despite Microsoft’s ambitious claims, users report that NTFS reliability is not perfect. Here’s a case in point:
When my Windows NT Workstation computer crashed a little over a year ago, I discovered that its NTFS file system was damaged. I searched the Microsoft web site for recovery instructions and tools and found nothing that helped. So, I went to my local software store and purchased a third party disk recovery tool for Windows NT. When I opened the box, I was angered to discover that it supported recovery of FAT and FAT32 data, but not NTFS data.
Eventually, I recovered 95 percent of my data by using a free Linux utility that was able to open the damaged NTFS partition and copy its files. If I’d been without Linux, I’d be without my data.
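For the curious, here is roughly what such a session might look like at the Linux command line. This is a hedged sketch only: the device names (/dev/hda1, /dev/hda2) and the mount point are invented, the commands must be run as root, and the exact options depend on your system:

```
# Check and, if necessary, repair an ext2 filesystem that wasn't
# cleanly unmounted (run this only on an unmounted partition).
$ e2fsck /dev/hda2

# Mount a Windows NTFS partition read-only (Linux 2.2 can read NTFS)
# and copy its files somewhere safe; FAT and FAT32 partitions use the
# vfat filesystem type instead.
$ mkdir /mnt/winnt
$ mount -t ntfs -o ro /dev/hda1 /mnt/winnt
$ cp -a /mnt/winnt/Documents /home/backup/
```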
Like other server operating systems, Linux provides advanced disk management (RAID), which makes it possible to automatically duplicate stored data on several hard drives. This greatly improves the reliability of data storage; if one hard drive fails, the data can be read from another. Competing desktop operating systems such as Microsoft Windows 95/98 do not support this capability (though several third parties sell drivers that let you add this capability to your desktop operating system).
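As an illustration of the idea only (the device names are invented, and the exact tool depends on your Linux version: newer systems use the mdadm utility, while systems of this book's vintage used the raidtools package), mirroring two drives might look roughly like this:

```
# Combine two partitions into a mirrored (RAID-1) device, /dev/md0,
# then put an ext2 filesystem on the mirror and mount it. If either
# disk fails, the data can still be read from the other.
$ mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
$ mke2fs /dev/md0
$ mount /dev/md0 /home
```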
If you’re an old computer dog who remembers the days of MS-DOS, you may have a fondness for what’s now called the MS-DOS Prompt window. However, if you’ve worked exclusively within the Microsoft Windows point-and-click environment, you may not fully understand what the MS-DOS Prompt window is about. The MS-DOS Prompt window provides what’s called a command-line interface. By typing commands, chosen from a list of commands the operating system understands, you can direct the computer to perform a variety of tasks.
For most users, the command interface is not as convenient as the point-and-click interface offered by Microsoft Windows. That’s because you must know the commands the operating system understands, and must type them correctly, if you expect the operating system to do your bidding.
However, the MS-DOS Prompt window lets you accomplish tasks that would be cumbersome and time-consuming if performed by pointing and clicking. Linux comes with a similar command interface, known as the shell. But the word “similar” fails to do justice to the Linux shell’s capabilities, because the MS-DOS command line provides only a small fraction of the capabilities of the Linux shell.
In particular, the MS-DOS command line lacks many ease-of-use features found in the Linux shell. You may have used the MS-DOS command line and, finding it distastefully cumbersome, forever rejected it in favor of pointing and clicking. If so, you’ll be pleasantly surprised to see how easy it is to use the Linux shell. You’ll certainly be pleased—perhaps amazed—by the enormous power it offers. You’ll learn more about the Linux shell in Chapter 4.
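As a small taste of what’s to come, here is the kind of chore that takes one line at the shell prompt but a great deal of pointing and clicking in a graphical file manager. This is a sketch with invented file names, not an example from a later chapter:

```
# Rename every file ending in .htm so that it ends in .html instead.
$ for f in *.htm; do mv "$f" "${f%.htm}.html"; done

# Find every file below the current directory that mentions "invoice"
# and copy each match, preserving its timestamp, into ~/backup.
$ grep -rl invoice . | xargs -I{} cp -p {} ~/backup/
```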
If you’re a programmer, you’ll also admire the ease with which it’s possible to develop portable, Unix-compliant software by using Linux. Linux comes with a complete suite of software development tools, including an assembler, C compiler, C++ compiler, make application, and source code librarian. All of these are freely distributable programs made available under the terms of the GNU GPL.
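To give a flavor of those tools, here is a minimal sketch of compiling and running a program with the GNU C compiler and make. The file names are examples, and the session assumes the development tools are installed:

```
$ cat hello.c
#include <stdio.h>

int main(void)
{
    printf("Hello from Linux!\n");
    return 0;
}
$ gcc -Wall -o hello hello.c        # compile with the GNU C compiler
$ ./hello
Hello from Linux!
$ cat Makefile
hello: hello.c
	gcc -Wall -o hello hello.c
$ make                              # rebuilds hello only when hello.c changes
```

Note that the indented command line in the Makefile must begin with a tab character, not spaces.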