What Is Linux?

Linux is an operating system, a software program that controls your computer. Most PC vendors load an operating system—generally, Microsoft Windows—onto the hard drive of a PC before delivering the PC; so, unless the hard drive of your PC has failed or you’ve upgraded your operating system, you may not understand the function of an operating system.

An operating system handles user interaction with a system and provides a comfortable view of the system. In particular, it solves several problems arising from variation among hardware. As you’re aware, no two PC models have identical hardware. For example, some PCs have an IDE hard drive, while others have a SCSI hard drive. Some PCs have one hard drive; others have two or more. Most PCs have a CD-ROM drive, but some do not. Some PCs have an Intel Pentium CPU, while others have an AMD Athlon, and so on.

Suppose that, in a world without operating systems, you’re programming a new PC application—perhaps a new multimedia word processor. Your application must cope with all the possible variations of PC hardware. As a result, it becomes bulky and complex. Users don’t like it because it consumes too much hard drive space, takes a long time to load, and—because of its size and complexity—has more bugs than it should. Operating systems solve this problem by providing a standard way for applications to access hardware devices. Thanks to the operating system, applications can be more compact, because they share the commonly used code for accessing the hardware. Applications can also be more reliable, because common code is written only once—and by expert systems programmers rather than by application programmers.
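To make the idea concrete, here is a minimal sketch (the file path is arbitrary): on Linux, the same commands read and write a file whether the disk behind it is IDE or SCSI, because the kernel's device drivers present applications with one uniform interface.

```shell
# Writing and reading a file looks identical no matter what kind of
# drive holds it; the kernel's device drivers hide the differences.
echo "hardware-independent" > /tmp/demo.txt   # write through the OS
cat /tmp/demo.txt                             # read it back
```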

As you’ll soon learn, operating systems do many other things as well; for example, they generally provide a filesystem so you can store and retrieve data and a user interface so you can control your computer. However, if you think of a computer’s operating system as its subconscious mind, you won’t be far off the mark. It’s the computer’s conscious mind—applications such as word processors and spreadsheets—that do useful work. But, without the subconscious—the operating system—the computer would cease breathing and applications would not function.

Desktop and Server Operating Systems

Now that you know what an operating system is, you may be wondering what operating systems other PC users are using. According to the market research firm IDC, Microsoft products account for over 90 percent of sales of desktop operating systems. Because Linux is a free operating system, Linux sales are a mere fraction of actual Linux installations. Unlike most commercial operating systems, Linux is not sold under terms of a per-seat license; a company is free to purchase a single Linux CD-ROM and install Linux on as many systems as they like.[1] So, sales figures understate the popularity of Linux. Moreover, it’s important to consider who uses a product and what they use it for, rather than merely the number of people using it. Linux is particularly popular among power users who run web sites and databases and write their own code. Hence, though Linux is popular, its influence is even greater than its popularity suggests.

Later in this chapter you’ll learn how Linux is distributed, but notice that Linux was termed a free operating system. If you have a high-speed Internet connection, you can download, install, and use Linux without paying anyone for anything (except perhaps your Internet Service Provider, who may impose a connection fee). It’s anyone’s guess how many people have downloaded Linux, but it appears that about 10 million computers now run Linux.

This book focuses on how Linux can be used on the desktop. However, if you plan to set up a Linux server and are unfamiliar with Linux and Unix, this book is a great starting point.

This book will take you through the basics of setting up and using Linux as a desktop system. After you’ve mastered what this book offers, you should consult Running Linux, by Matt Welsh, Matthias Kalle Dalheimer, Terry Dawson, and Lar Kaufman (O’Reilly & Associates, Inc.), a more advanced book that focuses on setting up and using Linux servers. You might also enjoy Linux in a Nutshell, by Ellen Siever, Stephen Figgins, and Aaron Weber (O’Reilly); this book puts useful Linux reference information at your fingertips.

How Linux Is Different

Linux is distinguished from other popular operating systems in three important ways:

  • Linux is a cross-platform operating system that runs on many computer models. Only Unix, an ancestor of Linux, rivals Linux in this respect. In comparison, Windows 2000 and XP run only on CPUs having the Intel architecture.

  • Linux is free, in two senses. First, as mentioned earlier, you can obtain and use Linux without paying anything to anybody. That said, you may choose to purchase Linux from a company that bundles Linux with special documentation or applications or provides technical support.

    Second, and more important, Linux and many Linux applications are distributed in source form. Thus, if you have the necessary skill, you’re free to modify or improve them. You’re not free to do this with most operating systems, which are distributed in binary form. For example, you can’t make changes to Windows or Office—only Microsoft can do that. Because of this freedom, Linux is being constantly improved and updated, far outpacing the rate of progress of any other operating system. For example, Linux was the first operating system to support Intel’s Itanium 64-bit CPU.

  • Linux offers attractive features and strong performance. Free access to Linux source code lets programmers around the world implement new features and tweak Linux to improve its performance and reliability. The best of these features and tweaks are incorporated in the Linux kernel or made available as kernel patches or applications. Not even Microsoft can mobilize and support a software development team as large and dedicated as the volunteer Linux software development team, which numbers in the hundreds of thousands, including programmers, code reviewers, and testers.

The Origins of Linux

Linux traces its ancestry back to a mainframe operating system known as Multics (Multiplexed Information and Computing Service). Multics was one of the first multiuser computer systems and is still in use today. Participating in its development, which began in 1965, was Bell Telephone Labs, along with the Massachusetts Institute of Technology (MIT) and General Electric.

Two Bell Labs software engineers, Ken Thompson and Dennis Ritchie, worked on Multics until Bell Labs withdrew from the project in 1969. One of their favorite pastimes during the project was playing a multiuser game called Space Travel. Without access to a Multics computer, they found themselves unable to indulge their fantasies of flying around the galaxy. Resolving to remedy this, they decided to port the Space Travel game to run on an otherwise unused PDP-7 computer. Eventually, they implemented a rudimentary operating system they named Unics, as a pun on Multics. Somehow, the spelling of the name became Unix.

Their operating system was novel in several respects, most notably its portability. Most previous operating systems had been written for a specific target computer. Just as a tailor-made suit fits only its owner, such an operating system could not be easily adapted to run on an unfamiliar computer. In order to create a portable operating system, Ritchie and Thompson first created a programming language called C. Like assembly language, C let a programmer access low-level hardware facilities not available to programmers writing in a high-level language such as FORTRAN or COBOL. But, like FORTRAN and COBOL, a C program was not bound to a particular computer. Just as a ready-made suit can be altered here and there to fit a purchaser, writing Unix in C made it possible to easily adapt Unix to run on computers other than the PDP-7.

As word of their work spread and interest grew, Ritchie and Thompson made copies of Unix freely available to programmers around the world. These programmers revised and improved Unix, sending word of their changes back to Ritchie and Thompson, who incorporated the best improvements in their version of Unix. Eventually, several Unix variants arose. Prominent among these was BSD (Berkeley Software Distribution) Unix, written at the University of California, Berkeley, in 1978. Bill Joy—one of the principals of the BSD project—later became a founder of Sun Microsystems, which sold another Unix variant (originally called SunOS and later called Solaris) to power its workstations. In 1984, AT&T, the parent company of Bell Labs, began selling its own version of Unix, known as System V.

Free Software

What Ritchie and Thompson began in a distinctly noncommercial fashion ended up spawning several legal squabbles. When AT&T grasped the commercial potential of Unix, it claimed Unix as its intellectual property and began charging a hefty licensing fee to those who wanted to use it. Soon, others who had implemented Unix-like operating systems were distributing licenses only for a fee. Understandably, those who had contributed improvements to Unix considered it unfair for AT&T and others to appropriate the fruits of their labors. This concern for profit was at odds with the democratic, share-and-share-alike spirit of the early days of Unix.

Some, including MIT scientist Richard M. Stallman, yearned for the return of those happier times and the mutual cooperation of programmers that had existed. So, in 1983, Stallman launched the GNU (GNU’s not Unix) project, which aimed at creating a free Unix-like operating system. Like early Unix, the GNU operating system was to be distributed in source form so that programmers could read, modify, and redistribute it without restriction. Stallman’s work at MIT had taught him that, by using the Internet as a means of communication, programmers could improve and adapt software at incredible speed, far outpacing the fastest rate possible using traditional software development models, in which few programmers actually see one another’s source code.

As a means of organizing work on the GNU project, Stallman and others created the Free Software Foundation (FSF), a nonprofit corporation that seeks to promote free software and eliminate restrictions on the copying, redistribution, understanding, and modification of software. Among other activities, the FSF accepts tax-deductible charitable contributions and distributes copies of software and documentation for a small fee, using this revenue to fund its operations and support development activities.

If you find it peculiar that the FSF charges a fee—even a small fee—for “free” software, you should understand that the FSF intends the word free to refer primarily to freedom, not price. The FSF believes in three fundamental software freedoms:

  • You can copy GNU software and give it away to anyone you choose.

  • If you’re a programmer, you can modify GNU software any way you like, because you have access to the source code. In return, your modified code should be available for others so they can enjoy the privileges of learning from and modifying it.

  • If you distribute modified GNU software, you must make the source code available, so that no one can unfairly profit by changing the original.

The Linux Kernel

By the early 1990s, the FSF had obtained or written all the major components of the GNU operating system except for one: the kernel. About that time, Linus Torvalds, a Finnish computer science student, began work on a kernel for a Unix-like system. Linus had been working with Minix, a Unix-like operating system written by Andrew Tanenbaum primarily for pedagogical use. Linus was disappointed by the performance of the Minix kernel and believed that he could do better. He shared his preliminary work with others on Internet newsgroups. Soon, programmers around the world were working together to extend and improve his kernel, which became known as Linux (for Linus’s Minix). As Table 1-1 shows, Linux grew rapidly. Linux was initially released on October 5, 1991, and as early as 1992, Linux had been integrated with GNU software and other open source software (http://www.opensource.org) to produce a fully functional operating system, which became known as Linux after the name of its kernel.

Table 1-1. The history of Linux

Version | Year | Estimated users       | Kernel size (KB) | Milestone(s)
0.01    | 1991 | 100                   | 63               | Linus Torvalds writes the Linux kernel.
0.99    | 1992 | 1,000                 | 431              | GNU software is integrated with the Linux kernel, producing a fully functional operating system.
0.99    | 1993 | 20,000                | 938              | High rate of code contributions prompts Linus to delegate code-review responsibility.
1.0     | 1994 | 100,000               | 1,017            | First production kernel is released.
1.2     | 1995 | 500,000               | 1,850            | Linux is ported to non-Intel processors.
2.0     | 1996 | 1,500,000             | 4,718            | Linux supports multiple processors, IP masquerading, and Java.
2.2     | 1999 | 7,500,000             | 10,593           | Linux growth rate exceeds that of Windows NT, according to market research firm Dataquest.
2.4     | 2001 | 10,000,000            | 19,789           | Linux invades the enterprise as major companies begin using it.
2.6     | 2003 | 20,000,000–50,000,000 | 32,476           | Linux is improved to include additional enterprise-level features, such as improved support for threads.

Work on Linux has not ceased. Since the initial production release, the pace of development has accelerated as Linux has gained support for non-Intel processors and even multiple processors, as well as sophisticated TCP/IP networking facilities such as firewalling, network address translation (NAT), and more. Versions of Linux are now available for such computer models and architectures as the PowerPC, the Compaq/DEC Alpha, the Motorola 68k, the Sun SPARC, the MIPS, IBM mainframes, and many others. Moreover, Linux does not implement an obscure Unix variant: it generally complies with the POSIX (Portable Operating System Interface) standard that forms the basis of the X/Open specifications of The Open Group.

The X Window System

Another important component of Linux is its graphical user interface (GUI; pronounced gooey), the X Window System. Unix was originally a mouse-less, text-based system that used noisy teletype machines rather than modern video monitors. The Unix command interface is very sophisticated and, even today, some power users prefer it to a point-and-click graphical environment, using their video monitors as though they are noiseless teletypes. Consequently, some remain unaware that Unix long ago outgrew its text-based childhood and now provides users a choice of graphical or command interfaces.

The X Window System (or simply X) was developed as part of MIT’s Project Athena, which began in 1984. In 1988, MIT released X to the public. Responsibility for X has since been transferred to The Open Group. The XFree86 Project, Inc. provides a freely redistributable version of X that runs on Intel-architecture PCs.

X is a unique graphical user interface in three major respects:

  • X integrates with a computer network, letting users access local and remote applications. For example, X lets you open a window from which you can interact with an application running on a remote host: the remote host does the heavy-duty computing; all your computer need do is pass the host your input and display the resulting output.

  • X lets you configure its look and feel to an amazing degree. To do so, you run a special application—called a window manager—on top of X. A variety of window managers is available, including some that closely mimic the look and feel of Microsoft Windows. Desktop managers further extend X by providing common applications such as file browsers, menus, and control panels. GNOME and KDE are the most popular Linux desktop managers and are discussed in this book.

  • X is optional. Systems used as servers are often configured without a GUI, saving resources to serve client requests.
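As a simple sketch of the first point (the name remotehost is hypothetical): every X client locates its display through the DISPLAY environment variable, which can name a remote machine as easily as the local one.

```shell
# An X client reads the DISPLAY variable to find its display.
# "remotehost:0" means display 0 on the machine named remotehost;
# ":0" alone means the first display on the local machine.
DISPLAY=remotehost:0
echo "This client would draw its windows on: $DISPLAY"
```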

Linux Distributions

Because Linux can be freely redistributed, you can obtain it in a variety of ways. Various individuals and organizations package Linux, often combining it with free or proprietary applications. Such a package that includes all the software you need to install and run Linux is called a Linux distribution. Table 1-2 shows some of the most popular Linux distributions.

Table 1-2. Popular Linux distributions and their home pages

Distribution             | Home page
Debian GNU/Linux         | http://www.debian.org
Fedora Core              | http://fedora.redhat.com
Gentoo Linux             | http://www.gentoo.org
Mandrake Linux           | http://www.mandrakelinux.com
Red Hat Enterprise Linux | http://www.redhat.com
Slackware Linux          | http://www.slackware.com
SuSE Linux               | http://www.suse.com

Red Hat Enterprise Linux, Mandrake Linux, SuSE, and Slackware are packaged by commercial companies, which seek to profit by selling Linux-related products and services. However, because Linux is distributed under the GNU GPL, you can download the source code related to these distributions from the respective companies’ web sites and make additional copies. (Note, however, that you cannot necessarily make additional copies of proprietary software that these companies may distribute with their Linux distribution.) Debian GNU/Linux is the product of volunteer effort conducted under the auspices of Software in the Public Interest, Inc. (http://www.spi-inc.org), a nonprofit corporation. Fedora Core and Gentoo are also the products of volunteer efforts. However, Red Hat provides significant support to the team responsible for Fedora Core.

Red Hat’s Linux Distributions

Red Hat formerly provided a single series of Linux distributions known as Red Hat Linux. Red Hat Linux included distinct offerings for various uses (for instance, workstations versus servers), but the offerings were all based on one stable core. Now Red Hat provides several Linux distributions, and it’s worth understanding the strengths and weaknesses of each. At the same time, remember that the distributions all include essentially the same software.

Red Hat came to recognize that its distribution appeals to two very different types of user. One type is business clients who want stable software that comes with support and who are willing to pay significant money for that support. If the software changes slowly, that’s fine with such users, because it means fewer wrinkles and less time spent on upgrades. The other type loves to experiment and make use of the newest, most advanced features of Linux. These users don’t mind upgrading frequently, but they aren’t running mission-critical systems and don’t want to pay for support. In fact, they’re used to getting Linux at no cost.

Red Hat decided that by offering multiple distributions, it could make both types of users happy—and get revenue from its efforts—while benefiting from the testing and experimentation of the user community. Thus, any software that bears the trademark “Red Hat” is licensed software. It is backed by Red Hat in some manner, often through support contracts that include upgrades. It is expected to change only once every two or three years. By contrast, the freely downloadable version of the distribution is called “Fedora Core” or simply “Fedora.” Fedora comes without Red Hat support. New versions of Fedora are released every few months and contain a lot of new features.

Right now, the Red Hat Enterprise Linux distribution and Fedora are very similar. To readers of this book, they essentially work the same way, and only some details of installation differ. We include the publisher’s edition of Fedora Core on two CDs, but we have tested the material thoroughly with both Fedora Core and Red Hat Enterprise Linux, so the book applies to both.

Naturally, if you install another version of either system, you may find minor differences between the menus or software versions in this book and the ones on your system. However, most of the things described in this book have been stable for some time and are not likely to change quickly.

Despite the outward similarity between Red Hat Enterprise Linux and Fedora Core, these distributions reflect divergent goals and methods. The Fedora Core project team promises 2-3 releases per year of its distribution, a rapid development pace that enables it to incorporate the most recent available versions of Linux software in a timely manner. However, Fedora Core’s rapid release cycle does not afford the opportunity to perform the quality assurance appropriate for enterprise environments, where reliability and security are crucial. Moreover, updates to a given release of Fedora Core are promised to be available only until 2-3 months after a subsequent release. Thus, users of Fedora Core can expect to have to install a new release of Fedora Core every 6-9 months or leave their systems vulnerable to security flaws for which updates are not available. Consequently, Fedora Core is best suited for use by hobbyists and others interested in sampling the bleeding edge of Linux technology and features.

Red Hat Enterprise Linux has a longer release cycle of 12-18 months. And, updates to releases of Red Hat Enterprise Linux are promised to be available for five years after each release. Consequently, users of Red Hat Enterprise Linux will not need to re-install their operating system as often as users of Fedora Core. Moreover, the longer release cycle of Red Hat Enterprise Linux permits third-party certification of Red Hat Enterprise Linux applications and hardware vendor support of systems running Red Hat Enterprise Linux, considerations which are important, even crucial, to enterprise users. However, the long release cycle is not without its drawbacks. For instance, at any given time, several releases of Red Hat Enterprise Linux may be in use. So, if you’re using a release of Red Hat’s Enterprise Linux other than Red Hat Enterprise Linux WS 3, you’ll probably find some differences between what’s installed on your system and what’s shown in this book.

Linux Features and Performance

The origins of Linux and the availability of its source code set it apart from other operating systems. But most users choose an operating system based on features and performance—and Linux delivers these in spades.

Linux runs on a wider range of hardware platforms than other operating systems, and it runs adequately on less costly and less powerful systems. Moreover, Linux systems are generally highly reliable.

But this impressive inventory of selling points doesn’t end the matter. Let’s consider some other technical characteristics of Linux that distinguish it from the pack:

Cost

Foremost in the minds of many is the low cost of Linux. Comparable server operating systems can cost more than $100,000. The low cost of Linux also makes it practical even as a desktop operating system; in that mode, it truly eclipses the competition.

Power

Many desktop systems are employed as servers. Because of its design and heritage, the features and performance of Linux readily outshine those of desktop operating systems used as makeshift servers. For example, Microsoft’s software license for Windows NT/2000/XP restricts the number of authenticated client connections; if you want your Windows NT/2000/XP server to be able to handle 100 authenticated clients, you must pay Microsoft a hefty license fee. However, Linux imposes no such restriction; your Linux desktop or server system is free to accept as many client connections as you think it can handle.

Availability

Properly configured Linux systems are quite stable—almost as stable as the hardware on which they run. Moreover, installing Linux software or updates doesn’t generally require you to reboot your system. So, Linux systems tend to be highly available.

Reliability

Again, because of its design and heritage, Linux provides more reliable data storage than competing desktop operating systems. Most Linux users store their disk data using the ext3 filesystem, which is superior in performance and reliability to filesystems (partition types) provided by Microsoft operating systems, including FAT, FAT32, and NTFS. Of course, if you’re worried about losing a whole disk to hardware failure, you can outfit Linux with the powerful facility known as Redundant Array of Independent Disks (RAID).

Filesystem reliability

Microsoft claims that its NTFS filesystem is so reliable that you’ll probably never need special software tools to recover lost data—truth is, Microsoft provides no such tools. Despite Microsoft’s ambitious claims, some Windows NT users report that NTFS reliability is less than satisfactory.

Here’s a case in point: When my Windows NT workstation crashed a little over a year ago, I discovered that its NTFS filesystem was damaged. I searched the Microsoft web site for recovery instructions and tools and found nothing that helped. So I went to my local software store and purchased a third-party disk recovery tool for Windows NT. When I opened the box, I was angered to discover that it supported recovery of FAT and FAT32 data, but not NTFS data.

Eventually, I recovered 95 percent of my data by using a free Linux utility that was able to open the damaged NTFS partition and copy its files. If I’d been without Linux, I’d be without my data.

The command-line interface

If you’re an old computer dog who remembers the days of MS-DOS, you may have a fondness for what’s now called the MS-DOS Prompt window or the Command Line Interface (CLI). However, if you’ve worked exclusively within the Windows point-and-click environment, you may not fully understand what the MS-DOS Prompt window is about. By typing commands in the MS-DOS Prompt window, you can direct the computer to perform a variety of tasks.

For most users, the MS-DOS Prompt is not as convenient as the GUI offered by Windows. That’s because you must know the commands the operating system understands and must type them correctly if you expect the operating system to do your bidding.

However, the MS-DOS Prompt window lets you accomplish tasks that would be cumbersome and time-consuming if performed by pointing and clicking. Linux comes with a similar command interface, known as the shell. But, the word similar fails to do justice to the Linux shell’s capabilities, because the MS-DOS Prompt provides only a fraction of the capabilities provided by the Linux shell.

You may have used the MS-DOS Prompt and, finding it distastefully cumbersome, forever rejected it in favor of pointing and clicking. If so, you’ll be pleasantly surprised to see how easy it is to use the Linux shell. You’ll certainly be pleased—perhaps amazed—by the enormous power it offers. Moreover, you can customize the operation of the Linux shell in an almost limitless number of ways, choose from among a variety of shells, and automate your work by combining commands into files called shell scripts. You’ll learn more about the Linux shell in Chapter 7.
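As a small taste of that power, here is a sketch of a shell pipeline: three ordinary commands chained together to count duplicate lines, a task that would be tedious to perform by pointing and clicking.

```shell
# Feed three lines through a pipeline: sort them, collapse adjacent
# duplicates with a count (uniq -c), then sort by count, largest first.
printf 'apple\nbanana\napple\n' | sort | uniq -c | sort -rn
```

Each command in the pipeline does one simple job; the shell’s pipe operator (`|`) connects them into something more capable than any one of them alone.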

Developing portable code

If you’re a programmer, you’ll also admire the ease with which it’s possible to develop portable, Unix-compliant software. Linux comes with a suite of software development tools, including an assembler, C/C++ compilers, a make application, and a source code librarian. All of these are freely distributable programs made available under the terms of the GNU GPL.



[1] Recently, some Linux vendors, including Red Hat, have begun bundling services with their Linux distributions. Your service agreement with such a vendor may forbid you to install unlicensed copies of the vendor’s distribution or may impose penalties for doing so. See Section 1.2.7 for more information.

Get Learning Red Hat Enterprise Linux & Fedora, Fourth Edition now with the O’Reilly learning platform.