How the Tests Were Done

Reviewing these 10 top web sites illustrates how performance best practices are followed in real-world pages. One challenge in an analysis of this type is that its subject is a moving target: these web sites change constantly. For example, during my analysis one web site switched from IIS to Apache. It's possible, and even likely, that the page I analyzed is not the one you'll see if you visit that web site today. Ideally, the page you find will implement the suggestions and other best practices highlighted here, and will perform well and load quickly.

The charts of HTTP requests were generated by IBM Page Detailer (http://alphaworks.ibm.com/tech/pagedetailer). This is my preferred packet sniffer because it works across all HTTP clients. I also like the way IBM Page Detailer shows how the HTTP requests are associated with the corresponding HTML document. The HTTP chart makes it easy to identify bottlenecks in component downloads, and the bars are color-coded to indicate the type of component being downloaded.

The response times were measured using Gomez's web monitoring services (http://www.gomez.com). The response time is defined as the time from when the request is initiated to when the page's onload event fires. Each URL was measured thousands of times over a low-broadband connection (56K-512K); the average of those measurements is shown here.
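As a rough sketch of this definition (not the Gomez instrumentation itself), a page can approximate its own response time by recording a timestamp in an inline script at the top of the HTML and comparing it to a timestamp taken in the onload handler; the variable names here are illustrative:

    <script type="text/javascript">
    // Recorded as early as possible; this approximates "request initiated"
    // (a true start time would come from the monitoring agent, not the page).
    var startTime = new Date().getTime();

    window.onload = function() {
        // Elapsed milliseconds from the first script execution to onload.
        var loadTime = new Date().getTime() - startTime;
        document.title += " (" + loadTime + " ms)";
    };
    </script>

A monitoring service such as Gomez measures from outside the page, so its numbers also include DNS lookup, connection setup, and the time to receive the first bytes of HTML, which this in-page approximation misses.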

I used Firebug (http://www.getfirebug.com) to analyze the JavaScript and CSS in the various pages. Firebug is a critical tool for ...
