A Quick History of Analytics
For as long as servers have existed, they’ve generated logfiles. Early on, these logs were just another source of diagnostic data for someone in IT. Each time a server handled a request, it wrote a single line of text to disk. This line contained only a few details, and it followed the Common Log Format, or CLF. It included information about the user (where she connected from) and about the request itself (the date and time it occurred, the request line, the returned HTTP status code, and the byte length of the document or page transferred).
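A single CLF entry packs all of that into one line. A representative entry, with illustrative host, user, and path values, looks like this:

    127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326

Reading left to right: the client’s address, two identity fields (often just a dash), the timestamp, the request line, the HTTP status code, and the size of the response in bytes.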
It was only in the mid-1990s that information such as the user agent (the browser) and the referrer (where a user came from) was added to logfiles. A slightly more detailed logging format, known as the Extended Log Format (ELF), followed in early 1996. ELF added more client and server information.
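In the widely used “combined” log format, for example, those two new fields were simply appended to the CLF line as quoted strings (the referring URL and browser string here are illustrative):

    127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"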
ELF gave many companies their first glimpse of what was happening on their websites. Web logs were sparse and focused on the technical side of the web server: which objects were requested, which clients requested them, when they were retrieved, and the HTTP status codes returned in response to those requests.
At first, web operators parsed these logfiles to find problems, searching for a specific error such as a “404 Not Found,” which indicated a missing file. They quickly realized, however, that they also wanted aggregate data from the logs, such as how many requests the servers had handled that day.
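Both tasks take only a few lines of scripting. Here is a minimal Python sketch, assuming a CLF-style logfile at the hypothetical path access.log, that collects the 404s and tallies requests per day:

    import re
    from collections import Counter

    # Common Log Format: host identity user [date] "request" status bytes
    CLF = re.compile(
        r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
        r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
    )

    missing = []                  # requests that came back 404
    requests_per_day = Counter()  # total requests, keyed by day

    with open("access.log") as log:   # hypothetical logfile path
        for line in log:
            match = CLF.match(line)
            if match is None:
                continue              # skip lines that aren't valid CLF
            if match.group("status") == "404":
                missing.append(match.group("request"))
            # The date field looks like 10/Oct/2000:13:55:36 -0700;
            # the part before the first colon is the day.
            day = match.group("date").split(":", 1)[0]
            requests_per_day[day] += 1

    print(f"{len(missing)} requests for missing files")
    for day, count in requests_per_day.items():
        print(f"{day}: {count} requests")

Swap the status test or the aggregation key, and the same pattern answers most of the questions early web operators asked of their logs.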
So coders like ...