Over the past five years, the world has become increasingly mobile. As a result, traditional ways of networking the world have proven inadequate to meet the challenges posed by our new collective lifestyle. If users must be connected to a network by physical cables, their movement is dramatically reduced. Wireless connectivity, however, poses no such restriction and allows a great deal more free movement on the part of the network user. As a result, wireless technologies are encroaching on the traditional realm of “fixed” or “wired” networks. This change is obvious to anybody who drives regularly: one of the daily “life and death” challenges for drivers is the gauntlet of erratically driven cars with mobile phone users behind the wheel.
Wireless connectivity for voice telephony has created a whole new industry. Adding mobility to voice telephony has had profound influences on the business of delivering calls because callers can be connected to people, not devices. We are on the cusp of an equally profound change in computer networking. Wireless telephony has been successful because it enables people to connect with each other regardless of location. New technologies targeted at computer networks promise to do the same for Internet connectivity. The most successful wireless data networking technology thus far has been 802.11.
In the first edition of this book, I wrote about 802.11 being the tip of the trend in mobile data networking. At the time, 802.11 and third-generation mobile technologies were duking it out for mindshare, but 802.11 has unquestionably been more successful to date.
To dive into a specific technology at this point is getting a bit ahead of the story, though. Wireless networks share several important advantages, no matter how the protocols are designed, or even what type of data they carry.
The most obvious advantage of wireless networking is mobility. Wireless network users can connect to existing networks and are then allowed to roam freely. A mobile telephone user can drive miles in the course of a single conversation because the phone connects the user through cell towers. Initially, mobile telephony was expensive. Costs restricted its use to highly mobile professionals such as sales managers and important executive decision makers who might need to be reached at a moment’s notice regardless of their location. Mobile telephony has proven to be a useful service, however, and now it is relatively common in the United States and extremely common among Europeans.
Likewise, wireless data networks free software developers from the tethers of an Ethernet cable at a desk. Developers can work in the library, in a conference room, in the parking lot, or even in the coffee house across the street. As long as the wireless users remain within the range of the base station, they can take advantage of the network. Commonly available equipment can easily cover a corporate campus; with some work, more exotic equipment, and favorable terrain, you can extend the range of an 802.11 network up to a few miles.
Wireless networks typically have a great deal of flexibility, which can translate into rapid deployment. Wireless networks use a number of base stations to connect users to an existing network. (In an 802.11 network, the base stations are called access points.) The infrastructure side of a wireless network, however, is qualitatively the same whether you are connecting one user or a million users. To offer service in a given area, you need base stations and antennas in place. Once that infrastructure is built, adding a user to a wireless network is mostly a matter of authorization: the infrastructure must be configured to recognize and offer services to the new user, but authorization does not require more infrastructure. Adding a user is a matter of configuring the infrastructure; it does not involve running cables, punching down terminals, or patching in a new jack.
Flexibility is an important attribute for service providers. One of the markets that many 802.11 equipment vendors have been chasing is the so-called “hot spot” connectivity market. Airports and train stations are likely to have itinerant business travelers interested in network access during connection delays. Coffeehouses and other public gathering spots are social venues in which network access is desirable. Many cafes already offer Internet access; offering Internet access over a wireless network is a natural extension of the existing Internet connectivity. While it is possible to serve a fluid group of users with Ethernet jacks, supplying access over a wired network is problematic for several reasons. Running cables is time-consuming and expensive and may also require construction. Properly guessing the correct number of cable drops is more an art than a science. With a wireless network, though, there is no need to suffer through construction or make educated (or wild) guesses about demand. A simple wired infrastructure connects to the Internet, and then the wireless network can accommodate as many users as needed. Although wireless LANs have somewhat limited bandwidth, the limiting factor in networking a small hot spot is likely to be the cost of WAN bandwidth to the supporting infrastructure.
Flexibility may be particularly important in older buildings because it reduces the need for construction. Once a building is declared historical, remodeling can be particularly difficult. In addition to meeting owner requirements, historical preservation agencies must be satisfied that new construction is not desecrating the past. Wireless networks can be deployed extremely rapidly in such environments because there is only a small wired network to install.
Flexibility has also led to the development of grassroots community networks. With the rapid price erosion of 802.11 equipment, bands of volunteers are setting up shared wireless networks open to visitors. Community networks are also extending the reach of Internet access beyond the limits of DSL into communities where high-speed Internet access has been only a dream. Community networks have been particularly successful in out-of-the-way places that are too rugged for traditional wireline approaches.
Like all networks, wireless networks transmit data over a network medium. The medium is a form of electromagnetic radiation. To be well-suited for use on mobile networks, the medium must be able to cover a wide area so clients can move throughout a coverage area. Early wireless networks used infrared light. However, infrared light has limitations; it is easily blocked by walls, partitions, and other office construction. Radio waves can penetrate most office obstructions and offer a wider coverage range. It is no surprise that most, if not all, 802.11 products on the market use the radio wave physical layer.
Wireless devices are constrained to operate in a certain frequency band. Each band has an associated bandwidth, which is simply the amount of frequency space in the band. Bandwidth has acquired a connotation of being a measure of the data capacity of a link. A great deal of mathematics, information theory, and signal processing can be used to show that higher-bandwidth slices can be used to transmit more information. As an example, an analog mobile telephony channel requires a 20-kHz bandwidth. TV signals are vastly more complex and have a correspondingly larger bandwidth of 6 MHz.
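One classical way to make the bandwidth-capacity relationship concrete is the Shannon-Hartley bound, which ties the maximum error-free data rate of a channel to its bandwidth and signal-to-noise ratio. The sketch below is illustrative only; the 20 dB SNR is an assumption chosen for the example, not a figure from the text.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley upper bound on error-free capacity for an
    additive-white-Gaussian-noise channel: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assume an SNR of 100 (20 dB) on both channels for comparison.
snr = 100
voice = shannon_capacity_bps(20e3, snr)  # 20-kHz analog voice channel
tv = shannon_capacity_bps(6e6, snr)      # 6-MHz TV channel

print(f"20 kHz channel: {voice / 1e3:.0f} kbps upper bound")
print(f"6 MHz channel:  {tv / 1e6:.1f} Mbps upper bound")
```

At the same signal quality, the 300-times-wider TV channel buys a 300-times-higher capacity ceiling, which is why "bandwidth" became shorthand for data capacity.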
Radio spectrum allocation is rigorously controlled by regulatory authorities through licensing processes. Most countries have their own regulatory bodies, though regional regulators do exist. In the U.S., regulation is done by the Federal Communications Commission (FCC). Many FCC rules are adopted by other countries throughout the Americas. European allocation is performed by the European Radiocommunications Office (ERO). Other allocation work is done by the International Telecommunication Union (ITU). To prevent overlapping uses of the radio waves, frequency is allocated in bands, which are simply ranges of frequencies available to specified applications. Table 1-1 lists some common frequency bands used in the U.S.
Table 1-1. Common frequency bands used in the U.S. (excerpt)

- C-Band satellite downlink
- C-Band Radar (weather)
- C-Band satellite uplink
- X-Band Radar (police/weather)
- Ku-Band Radar (police): 13.4-14 GHz and 15.7-17.7 GHz
In Table 1-1, there are three bands labeled ISM, which is an abbreviation for industrial, scientific, and medical. ISM bands are set aside for equipment that, broadly speaking, is used in industrial or scientific processes or by medical devices. Perhaps the most familiar ISM-band device is the microwave oven, which operates in the 2.4-GHz ISM band because electromagnetic radiation at that frequency is particularly effective for heating water.
I pay special attention to the ISM bands in the table because those bands allow license-free operation, provided the devices comply with power constraints. 802.11 operates in the ISM bands, along with many other devices. Common cordless phones operate in the ISM bands as well. 802.11b and 802.11g devices operate within the 2.4 GHz ISM band, while 802.11a devices operate in the 5 GHz band.
The more common 802.11b/g devices operate in S-band ISM. The ISM bands are generally license-free, provided that devices are low-power. How much sense does it make to require a license for microwave ovens, after all? Likewise, you don’t need a license to set up and operate a low-power wireless LAN.
Wireless networks are an excellent complement to fixed networks, but they are not a replacement technology. Just as mobile telephones complement fixed-line telephony, wireless LANs complement existing fixed networks by providing mobility to users. Servers and other data center equipment must access data, but the physical location of the server is irrelevant. As long as the servers do not move, they may as well be connected to wires that do not move. At the other end of the spectrum, wireless networks must be designed to cover large areas to accommodate fast-moving clients. Typical 802.11 access points do not cover large areas, and would have a hard time coping with users on rapidly-moving vehicles.
Traditional network security places a great deal of emphasis on physical security of the network components. Data on the network travels over well-defined pathways, usually of copper or fiber, and the network infrastructure is protected by strong physical access control. Equipment is safely locked away in wiring closets, and set up so that it cannot be reconfigured by users. Basic security stems from the (admittedly marginal) security of the physical layer. Although it is possible to tap or redirect signals, physical access control makes it much harder for an intruder to gain surreptitious access to the network.
Wireless networks have a much more open network medium. By definition, the network medium in a wireless network is not a well-defined path consisting of a physical cable, but a radio link with a particular encoding and modulation. Signals can be sent or received by anybody in possession of the radio techniques, which are of course well known because they are open standards. Interception of data is child’s play, given that the medium is open to anybody with the right network interface, and the network interface can be purchased for less than $50 at your local consumer electronics store. Careful shopping online may get you cards for half of that.
Furthermore, radio waves tend to travel outside their intended location. There is no abrupt physical boundary of the network medium, and the range at which transmissions can be received can be extended with high-gain antennas on either side. When building a wireless network, you must carefully consider how to secure the connection to prevent unauthorized use, traffic injection, and traffic analysis. With the maturation of wireless protocols, the tools to authenticate wireless users and properly encrypt traffic are now well within reach.
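A back-of-the-envelope link budget shows why high-gain antennas extend range so effectively. The sketch below uses the standard free-space path-loss (Friis) formula; the transmit power, antenna gains, and distance are illustrative assumptions, not values from the text.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def received_dbm(tx_dbm: float, tx_gain_dbi: float, rx_gain_dbi: float,
                 distance_m: float, freq_hz: float = 2.4e9) -> float:
    """Received power = TX power + antenna gains - path loss (Friis)."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m, freq_hz)

# Illustrative: a 15 dBm transmitter at 2.4 GHz, stations 100 m apart.
stock = received_dbm(15, 2, 2, 100)        # stock 2 dBi omni antennas
high_gain = received_dbm(15, 14, 14, 100)  # 14 dBi directional antennas

print(f"stock antennas:     {stock:.1f} dBm")
print(f"high-gain antennas: {high_gain:.1f} dBm")
```

Swapping both 2 dBi antennas for 14 dBi ones adds 24 dB to the link budget, which is also 24 dB available to an eavesdropper outside the intended coverage area.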
Once a wired network is put in place, it tends to be boring, which is to say, predictable. Once the cables have been put in place, they tend to do the same thing day in and day out. Provided the network has been designed according to the engineering rules laid out in the specification, the network should function as expected. Capacity can be added to a wired network easily by upgrading the switches in the wiring closet.
In contrast, the physical medium on wireless LANs is much more dynamic. Radio waves bounce off objects, penetrate walls, and often behave somewhat unpredictably. Radio waves can suffer from a number of propagation problems that may interrupt the radio link, such as multipath interference and shadows. Without a reliable network medium, wireless networks must carefully validate received frames to guard against frame loss. Positive acknowledgment, the tactic used by 802.11, does an excellent job of ensuring delivery at some cost to throughput.
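Positive acknowledgment in its simplest form (stop-and-wait) can be sketched as follows. This toy model only shows the throughput cost of retransmission; it is not the actual 802.11 MAC, which adds sequence numbering, interframe spacing, and other machinery. The loss rate and retry limit are arbitrary illustrative values.

```python
import random

def send_with_ack(frames, loss_rate=0.3, max_retries=7, rng=None):
    """Stop-and-wait positive acknowledgment: retransmit each frame
    until the receiver confirms it (or the retry limit is exhausted).
    Returns (delivered_frames, total_transmissions)."""
    rng = rng or random.Random(0)  # seeded for a repeatable demo
    delivered, transmissions = [], 0
    for frame in frames:
        for _ in range(max_retries + 1):
            transmissions += 1
            if rng.random() > loss_rate:  # frame and its ACK got through
                delivered.append(frame)
                break
    return delivered, transmissions

frames = [f"frame-{i}" for i in range(10)]
delivered, tx = send_with_ack(frames, loss_rate=0.3)
print(f"{len(delivered)}/{len(frames)} frames delivered "
      f"using {tx} transmissions")
```

The extra transmissions are the throughput cost: every lost frame (or lost acknowledgment) consumes airtime again until delivery is confirmed.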
Radio links are subject to several additional constraints that fixed networks are not. Because radio spectrum is a relatively scarce resource, it is carefully regulated. There are two ways to make a radio network go faster: either more spectrum can be allocated, or the encoding on the link can be made more sensitive so that it packs more data into each unit of time. Additional spectrum allocations are relatively rare, especially for license-free networks. 802.11 networks have kept the bandwidth of a station's radio channel to approximately 30 MHz, while developing vastly improved encoding to improve speed. Faster coding methods can increase the speed, but have one potential drawback: because they depend on the receiver picking out subtle signal differences, much greater signal-to-noise ratios are required. Higher data rates therefore require the station to be located closer to its access point. Table 1-2 shows the standardized physical layers in 802.11 and their respective speeds.
Table 1-2. 802.11 physical layers and speeds

- 802.11: 1 Mbps, 2 Mbps. First PHY standard (1997). Featured both frequency-hopping and direct-sequence modulation techniques.
- 802.11a: up to 54 Mbps. Second PHY standard (1999), but products not released until late 2000.
- 802.11b: 5.5 Mbps, 11 Mbps. Third PHY standard, but second wave of products. The most common 802.11 equipment as the first edition of this book was written, and the majority of the legacy installed base at the time the second edition was written.
- 802.11g: up to 54 Mbps. Fourth PHY standard (2003). Applies the coding techniques of 802.11a for higher speed in the 2.4 GHz band, while retaining backwards compatibility with existing 802.11b networks. The most common technology included with laptops in 2005.
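The trade-off described above, that faster coding needs a cleaner signal, falls directly out of the same information-theoretic capacity bound: solving C = B * log2(1 + SNR) for SNR shows how quickly the required signal quality climbs as the rate rises in a fixed-width channel. The 20-MHz channel width and the rates below are illustrative, and the bound is an idealized floor rather than what real modulations achieve.

```python
import math

def min_snr_db(rate_bps: float, bandwidth_hz: float) -> float:
    """Minimum SNR (in dB) an ideal channel of the given width needs
    to carry the given rate: SNR = 2**(C/B) - 1, converted to dB."""
    snr_linear = 2 ** (rate_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

# A fixed 20-MHz channel: faster rates demand ever-higher SNR,
# which in practice means standing closer to the access point.
for rate_mbps in (6, 24, 54):
    snr = min_snr_db(rate_mbps * 1e6, 20e6)
    print(f"{rate_mbps:2d} Mbps needs at least {snr:5.1f} dB SNR")
```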
Radio is inherently a broadcast medium. When one station transmits, all other stations must listen. Access points act much like old shared Ethernet hubs in that there is a fixed amount of transmission capacity per access point, and it must be shared by all the attached users. Adding capacity requires that the network administrator add access points while simultaneously reducing the coverage area of existing access points.
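The arithmetic of a shared medium is unforgiving: each active user gets, at best, an equal slice of an access point's usable capacity. The 30 Mbps figure below is an assumed usable-throughput number chosen for the example, not a value from the text.

```python
def per_user_share_mbps(ap_capacity_mbps: float, n_users: int) -> float:
    """An access point's airtime is shared, so each of n active users
    gets at most an equal slice of the AP's usable capacity."""
    return ap_capacity_mbps / n_users

# Assume roughly 30 Mbps of usable throughput behind one access point.
for users in (1, 5, 20):
    share = per_user_share_mbps(30, users)
    print(f"{users:2d} active users -> {share:.1f} Mbps each, at best")
```

This is why adding capacity means adding access points with smaller coverage areas: the only way to raise the per-user share is to put fewer users behind each radio.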
Many wireless networks are based on radio waves, which makes the network medium inherently open to interception. Properly protecting radio transmissions on any network is always a concern for protocol designers. 802.11 did not build in much in the way of security protocols. Coping with the inherent unreliability of the wireless medium and mobility required several protocol features to confirm frame delivery, save power, and offer mobility. Security was quite far down the list, and proved inadequate in the early specifications.
Users of wireless networks must be strongly authenticated to prevent access by unauthorized parties, and authenticated connections must be strongly encrypted to prevent traffic interception and injection. Technologies that offer strong encryption and authentication have emerged since the first edition of this book, and are a major component of the revisions for the second edition.
Wireless networking is a hot industry segment. Several wireless technologies have been targeted primarily for data transmission. Bluetooth is a standard used to build small networks between peripherals: a form of “wireless wires,” if you will. Most people in the industry are familiar with the hype surrounding Bluetooth, though it seems to have died down as real devices have been brought to market. In the first edition, I wrote that I have not met many people who have used Bluetooth devices, but it is much more common these days. (I use a Bluetooth headset on a regular basis.)
Post-second-generation (2.5G) and third-generation (3G) mobile telephony networks are also a familiar wireless technology. They promise data rates of megabits per cell, as well as the “always on” connections that have proven to be quite valuable to DSL and cable modem customers. After many years of hype and press from 3G equipment vendors, the rollout of commercial 3G services is finally underway. 2.5G services like GPRS, EDGE, and 1xRTT are now widely available, and third-generation networks based on UMTS or EV-DO are quickly being built. (I recently subscribed to an unlimited GPRS service to get connected during my train trips between my office and my home.) Many articles quote peak speeds for these technologies in the hundreds of kilobits per second or even megabits, but this capacity must be shared between all users in a cell. Real-world downstream speeds are roughly comparable to dial-up modem connections and cannot touch an 802.11 hot spot.
This is a book about 802.11 networks. 802.11 goes by a variety of names, depending on who is talking about it. Some people call 802.11 wireless Ethernet, to emphasize its shared lineage with traditional wired Ethernet (802.3). A second name, which has grown dramatically in popularity since the first edition of this book, is Wi-Fi, from the interoperability certification program run by the Wi-Fi Alliance, the major trade association of 802.11 equipment vendors. The Wi-Fi Alliance, formerly known as the Wireless Ethernet Compatibility Alliance (WECA), tests member products for compatibility with 802.11 standards. Other organizations perform compatibility testing as well; the University of New Hampshire's InterOperability Lab (IOL) recently launched a wireless test consortium.
Several standards groups are involved in 802.11-related standardization efforts because 802.11 cuts across many formerly distinct boundaries in networking. Most of the effort remains concentrated in the IEEE, but important contributions to wireless LAN standards have come from several major locations.
The first is the Institute of Electrical and Electronics Engineers (IEEE). In addition to its activities as a professional society, the IEEE works on standardizing electrical equipment, including several types of communication technology. IEEE standardization efforts are organized by projects, each of which is assigned a number. By far the most famous IEEE project is the IEEE 802 project to develop LAN standards. Within a project, individual working groups develop standards to address a particular facet of the problem. Working groups are also given a number, which is written after the decimal point for the corresponding projects. Ethernet, the most widely used IEEE LAN technology, was standardized by the third working group, 802.3. Wireless LANs were the eleventh working group formed, hence the name 802.11.
Within a working group, task groups form to revise particular aspects of the standard or to add to its general area of functionality. Task groups are assigned a letter beneath the working group, and the document produced by a task group combines the project and working group number, followed by the task group's letter. (Some letters that are easily confused with numerals, such as the lowercase “l,” are not used.) In wireless networking, the first task group to gain wide recognition was Task Group B (TGb), which produced the 802.11b specification. Table 1-3 is a basic listing of the different 802.11 standards.
Interestingly enough, the case of the letter in a standards revision encodes information. Lowercase letters indicate dependent standards that cannot stand alone from their parent, while uppercase letters indicate full-fledged standalone specifications.
802.11b adds a new clause to 802.11, but cannot stand alone, so the “b” is written in lowercase. In contrast, standards like 802.1Q and 802.1X are standalone specifications that are completely self-contained in one document, and therefore use uppercase letters.
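The naming convention can be captured mechanically. The small helper below is hypothetical, written only to make the case rule concrete: the digits before the decimal point name the project, the digits after it name the working group, and a trailing letter's case says whether the document stands alone.

```python
import re

def classify(name: str) -> str:
    """Split an IEEE designation like '802.11b' or '802.1Q' into
    project, working group, and revision letter, then apply the case
    rule: lowercase = dependent amendment, uppercase = standalone."""
    m = re.fullmatch(r"(\d+)\.(\d+)([A-Za-z]*)", name)
    if not m:
        raise ValueError(f"not an IEEE 802-style designation: {name}")
    project, group, letter = m.groups()
    if not letter:
        return f"{name}: base standard of working group {project}.{group}"
    kind = "standalone standard" if letter.isupper() else "dependent amendment"
    return f"{name}: {kind} from working group {project}.{group}"

print(classify("802.11b"))  # a dependent amendment (lowercase letter)
print(classify("802.1X"))   # a standalone standard (uppercase letter)
print(classify("802.3"))    # a base standard (no letter)
```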
At periodic intervals, the additions from dependent task groups will be “rolled up” into the main parent specification. The initial revision of 802.11 came out in 1997. Minor changes to the text were released as 802.11-1999, which was the baseline standard for quite some time. The most recent rollup is 802.11-2003.
Table 1-3. 802.11 standards and task groups

- 802.11: First standard (1997). Specified the MAC and the original slower frequency-hopping and direct-sequence modulation techniques.
- 802.11a: Second physical layer standard (1999), but products not released until late 2000.
- 802.11b: Third physical layer standard (1999), but second wave of products. The most common 802.11 equipment as the first edition of this book was written.
- TGc: Task group that produced a correction to the example encoding in 802.11a. Because its only product was a correction, there is no standalone 802.11c.
- 802.11d: Extends the frequency-hopping PHY for use across multiple regulatory domains.
- TGe (future 802.11e): Task group producing quality-of-service (QoS) extensions for the MAC. An interim snapshot called Wi-Fi Multi-Media (WMM) is likely to be implemented before the standard is complete.
- 802.11F: Inter-access point protocol to improve roaming between directly attached access points.
- 802.11g: Most recently standardized (2003) PHY for networks in the ISM band.
- 802.11h: Standard to make 802.11a compatible with European radio emissions regulations. Other regulators have adopted its mechanisms for different purposes.
- 802.11i: Improvements to security at the link layer.
- 802.11j: Enhancements to 802.11a to conform to Japanese radio emission regulations.
- TGk (future 802.11k): Task group to enhance communication between clients and network to better manage scarce radio use.
- TGma: Task group to incorporate changes made by 802.11a, 802.11b, and 802.11d, as well as changes made by TGc, into the main 802.11 specification. (Think “m” for maintenance.)
- TGn (future 802.11n): Task group founded to create a high-throughput standard. The design goal is throughput in excess of 100 Mbps, and the resulting standard will be called 802.11n.
- TGp (future 802.11p): Task group adapting 802.11 for use in automobiles. The initial use is likely to be a standard protocol used to collect tolls.
- TGr (future 802.11r): Enhancements to roaming performance.
- TGs (future 802.11s): Task group enhancing 802.11 for use as a mesh networking technology.
- TGT (future 802.11T): Task group designing a test and measurement specification for 802.11. Its result will be standalone, hence the uppercase letter.
- TGu (future 802.11u): Task group modifying 802.11 to assist in interworking with other network technologies.
When it became clear that authentication on wireless networks was fundamentally broken, the IEEE adopted several authentication standards originally developed by the Internet Engineering Task Force (IETF). Wireless LAN authentication depends heavily on protocols defined by the IETF.
The Wi-Fi Alliance is a combination of a trade association, testing organization, and standardization organization. Most of the Wi-Fi Alliance's emphasis is on acting as a trade association for its members, though it is also well known for the Wi-Fi certification program. Products are tested for interoperability with a testbed consisting of products from major vendors, and products that pass the test suite are awarded the right to use the Wi-Fi mark.
The Wi-Fi Alliance’s standardization efforts are done in support of the IEEE. When the security of wireless networks was called into question, the Wi-Fi Alliance produced an interim security specification called Wi-Fi Protected Access (WPA). WPA was essentially a snapshot of the work done by the IEEE security task group. It is more of a marketing standard than a technical standard, since the technology was developed by the IEEE. However, it serves a role in accelerating the development of secure wireless LAN solutions.
 While most of my colleagues, acquaintances, and family in the U.S. have mobile telephones, it is still possible to be a holdout. In Europe, it seems as if everybody has a mobile phone—one cab driver in Finland I spoke with while writing the first edition of this book took great pride in the fact that his family of four had six mobile telephones!
 This simple example ignores the challenges of scale. Naturally, if the new users will overload the existing infrastructure, the infrastructure itself will need to be beefed up. Infrastructure expansion can be expensive and time-consuming, especially if it involves legal and regulatory approval. However, my basic point holds: adding a user to a wireless network can often be reduced to a matter of configuration (moving or changing bits) while adding a user to a fixed network requires making physical connections (moving atoms), and moving bits is easier than moving atoms.
 Laser light is also used by some wireless networking applications, but the extreme focus of a laser beam makes it suited only for applications in which the ends are stationary. A common example is “fixed wireless,” in which lasers replace other access technology such as leased telephone circuits.