Chapter 1. What Is Varnish Cache?

Varnish Cache is a so-called reverse caching proxy. It’s a piece of software that you put in front of your web server(s) to reduce the loading times of your website/application/API by caching the server’s output. We’re basically talking about web performance.

In this chapter, I’ll explain why web performance is so important and how Varnish can improve it.

Why Does Web Performance Matter?

Many people underestimate the importance of web performance. The common logic is that if a website performs well when 10 users are accessing it, the site will also be fine when 1 million users want to access it. It only takes one successful marketing campaign to debunk that myth.

Performance and scalability aren’t one and the same. Performance is the raw speed of your website: how many (milli)seconds does it take to load the page? Scalability, on the other hand, is keeping the performance stable when the load increases. The latter is a reason for bigger organizations to choose Varnish. The former applies to everyone, even small projects.

Let’s say your website has about 100 visitors per day. Not that many, right? And the loading time of a page is 1.5 seconds—not great, but not that bad either. Without caching, it might take some time (and money) to reduce that loading time to less than a second. You might refactor your code or optimize your infrastructure. And then you might ask yourself if all of the effort was worth it. With a cache in front of your site, you get the same kind of speedup for a fraction of that effort.

It’s also important to know that web performance is an essential part of the user experience. Want to please your users and ensure they stay on your site? Then make sure your pages load fast. Even Google knows this—did you know that Google Search takes the loading times of your website into account when ranking search results?

Poor performance will not only hurt your Google ranking, but it will also impact your bottom line: people don’t have the patience to wait for slow content and will look for an alternative in a heartbeat. In a heavily saturated market, they’ll probably end up with one of your competitors.

Where Does Varnish Fit In?

With a correctly configured Varnish in front of your web server, you will reduce the loading times of your website with very little effort. Given that Varnish is open source and easy to set up, this is a no-brainer.

And if you play your cards right, who knows, maybe your site will become popular one day. The term “viral” comes to mind. If you already have a properly configured Varnish in place, you won’t need to take many more measures.

A lot of people think that Varnish is technology for big projects and large companies—the kind of sites that attract massive amounts of hits. That’s true; these companies do use Varnish. In fact, 13% of the top 10,000 websites rely on Varnish to ensure fast loading times. However, Varnish is also suitable for small and medium-sized projects. Have a look at Chapter 9 to learn about some of the success stories and business use cases.

All that being said, Varnish is not a silver bullet; it is only one part of the stack. Many more components are required to serve pages quickly and reliably, even under load. These components, such as the network, the server, the operating system, the web server, and the application runtime, can also fail on you.

The Varnish Cache Open Source Project

Varnish Cache is an open source project written in C. Because it’s open source, the source code is freely available online and Varnish is free of charge to use.

Varnish Cache is maintained by an active community, led by Poul-Henning Kamp. Although Varnish Cache is “free as in beer,” there’s still a company backing the project and funding most of its development. This company, called Varnish Software, is able to fund the Varnish Cache project by providing training, support, and extra features on top of Varnish.

Note

At the time of writing, the most common version, which we will be covering in this book, is 4.1. Version 5 was released on September 15, 2016. However, this does not mean that this book is outdated. The adoption process for new versions takes a while.

How Does Varnish Work?

Varnish is installed either on the web server itself or on a separate machine. Once installed and started, Varnish mimics the behavior of the web server that sits behind it. Usually, Varnish listens on TCP port 80, the conventional port for HTTP—unless, of course, Varnish itself sits behind another proxy. Varnish has one or more backends registered and communicates with one of these backends when a response cannot be retrieved from cache.
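In VCL, a backend is registered with a backend declaration. The following is a minimal sketch; the host and port are assumptions, so adjust them to wherever your web server actually listens:

    vcl 4.0;

    # Cache misses are fetched from this backend web server.
    # 127.0.0.1:8080 is an assumption; use your own host and port.
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }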

Varnish preallocates a chunk of virtual memory and uses it to store its objects. An object contains the HTTP response headers and the payload that Varnish receives from the backend. The objects stored in memory are served to clients requesting the corresponding HTTP resource. Objects in the cache are identified by a hash that, by default, is composed of the hostname (or the IP address if no hostname was specified) and the URL of the request.
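To give you an idea, the hashing behavior of the built-in VCL in Varnish 4 roughly corresponds to the following subroutine: the URL is always part of the lookup key, and the Host header (or the server IP) completes it.

    sub vcl_hash {
        # The request URL is always part of the lookup key.
        hash_data(req.url);
        # Add the Host header if present, otherwise the server IP.
        if (req.http.host) {
            hash_data(req.http.host);
        } else {
            hash_data(server.ip);
        }
        return (lookup);
    }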

Varnish is tremendously fast and relies on pthreads to handle a massive number of incoming requests. The threading model and the use of memory for storage result in a significant performance boost for your application. If configured correctly, Varnish Cache can easily make your website 1,000 times faster.
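As a rough sketch, this is what starting Varnish could look like on the command line; the VCL path, the cache size, and the thread pool limits below are assumptions that you would tune to your own hardware and traffic:

    varnishd -a :80 \
             -f /etc/varnish/default.vcl \
             -s malloc,256m \
             -p thread_pool_min=100 -p thread_pool_max=1000

The -a flag sets the listening address, -f points to the VCL file, -s malloc,256m reserves 256 MB of memory for object storage, and the thread_pool_* parameters bound the number of worker threads.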

Varnish uses the Varnish Configuration Language (VCL) to control the behavior of the cache. VCL is a domain-specific language that offers hooks to override and extend the behavior of the different states in the Varnish finite state machine. These hooks take the form of subroutines that live inside the VCL file. At startup time, the VCL file is read, translated to C, compiled, and dynamically loaded as a shared object.
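To make this a bit more tangible, here is a small, hedged example of what a VCL file could look like. It extends two of those hooks: vcl_recv, which runs when a request comes in, and vcl_backend_response, which runs when Varnish receives a response from the backend. The backend address, the file-extension pattern, and the one-hour TTL are all assumptions for the sake of illustration:

    vcl 4.0;

    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_recv {
        # Strip cookies from requests for static files so that
        # Varnish is able to cache them.
        if (req.url ~ "\.(css|js|png|jpg|gif)$") {
            unset req.http.Cookie;
        }
    }

    sub vcl_backend_response {
        # Keep backend responses in the cache for one hour.
        set beresp.ttl = 1h;
    }

Whatever you don’t override still falls through to the built-in VCL, so a file like this only has to describe the exceptions to the default behavior.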

The VCL syntax is quite extensive, but it has its limits. If you want to extend the behavior even further, you can write custom Varnish modules (VMODs) in C. These modules can contain literally anything you can program in C. The extended behavior is exposed to VCL through a set of functions that enrich the VCL syntax.
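For example, vmod_std, a module that ships with Varnish, exposes utility functions such as std.log(). Once a module is imported, its functions can be called from VCL as if they were part of the language (the backend address below is again just an assumption):

    vcl 4.0;

    import std;

    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_recv {
        # std.log() writes a message to the Varnish shared memory log,
        # which you can inspect with the varnishlog tool.
        std.log("Incoming request for " + req.url);
    }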

Caching Is Not a Trick

The reality of the matter is that most websites, applications, and APIs are data-driven: their main purpose is to present and visualize data that comes from a database or an external resource (a feed, an API, etc.). The majority of the processing time is spent retrieving, assembling, and visualizing that data.

When you don’t cache, that process is repeated for every client request. Imagine how many resources are wasted recomputing the same output even though the underlying data hasn’t changed.

If you decide to cache a computed result, you had better have good control over the original data. If the original data changes, you need to make sure the cache is updated or invalidated. However, emptying the cache too frequently defeats the purpose of caching. It’s safe to say that caching is a balancing act between serving up-to-date data and ensuring acceptable loading times.
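A common way to keep the cache in sync, shown here as a sketch, is to let the application send an HTTP PURGE request for a URL whenever the underlying data changes, and to restrict that mechanism to trusted clients. The addresses in the ACL and the backend details are assumptions:

    vcl 4.0;

    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    # Only these clients are allowed to purge objects from the cache.
    acl purge {
        "localhost";
        "192.168.0.0"/24;
    }

    sub vcl_recv {
        if (req.method == "PURGE") {
            if (!client.ip ~ purge) {
                return (synth(405, "Purging not allowed"));
            }
            # Remove the cached object for this URL.
            return (purge);
        }
    }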

Caching is not a trick, and it’s not a way to compensate for poorly performing systems or applications; caching is an architectural decision that, if done right, will increase efficiency and reduce infrastructure costs.

Conclusion

Slow websites suck. Users don’t have much patience, and in a highly saturated market, a fast website can give you the edge over your competitors. Raw performance is important, but a stable time to first byte under heavy load is just as important. We call this scalability, and it’s a tough nut to crack. There are plenty of ways to make your website scale, most of which require a considerable amount of time and money. Luckily, a decent caching strategy can reduce the impact of all that traffic. Varnish is a tool that can cache your HTTP traffic and take most of the load off your servers.