Optimizing app delivery with load balancers

Learn how good load balancers can distribute load, add security, and maintain performance and flexibility in web application delivery.

By Derek DeJonghe
March 23, 2017

Application delivery has evolved significantly in the cloud era. The industry has moved away from bulky servers hosting monolithic applications to thinly provisioned machines hosting applications that serve only a single purpose. Similarly, traditional approaches to resiliency, like failover and overprovisioning, have given way to newer methods like load balancing and on-demand nodes. Load balancing in particular has solved many issues in application delivery beyond just distributing load: reducing strain on critical servers, creating high availability for redundancy and business continuity, and improving overall system performance.

Load balancers can take the form of hardware, software, or both. The rise of cloud computing has led to an increase in the use of software-based load balancers, which can be brought online without a costly procurement cycle and can be installed and run in any environment. With software load balancers you're not concerned with the care and feeding of an underlying hardware appliance; you're concerned with the configuration that enables logic-based traffic routing to a dynamic environment. The versatility of this approach promotes healthy workflows between development and operations teams, with a solution that both can work with and maintain.
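
As a minimal sketch of what that configuration looks like in practice, here is a basic NGINX reverse proxy distributing requests across a pool of application servers; the upstream name and addresses are placeholders, not a recommendation:

    events {}

    http {
        # Pool of application servers; the addresses here are hypothetical
        upstream app_servers {
            server 10.0.1.10:8080;
            server 10.0.1.11:8080;
        }

        server {
            listen 80;

            location / {
                # Spread incoming requests across the pool
                proxy_pass http://app_servers;
            }
        }
    }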

Let’s take a look at the characteristics of a sophisticated software load balancer, and why these characteristics are important in a load balancing solution for the modern web.

Load balancing and HTTP feature sets

Load balancers are so essential to modern applications that it is difficult to think of an architecture that would not need one. Modern applications have moved to the web because HTTP is cross-platform. I choose to use load balancers that give me the power to manipulate HTTP requests and responses and to direct traffic to upstream servers accordingly. While migrating applications to the cloud, I often find myself needing solutions for session persistence, either to support legacy application requirements or for performance reasons. A strong load balancer will offer a way to route traffic based on session state identifiers stored in cookies or other headers.
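
NGINX, for example, can pin a session to a node by hashing a session cookie consistently across the pool. The cookie name and server addresses below are assumptions for illustration:

    upstream app_servers {
        # Requests carrying the same (hypothetical) "sessionid" cookie
        # are consistently routed to the same upstream server
        hash $cookie_sessionid consistent;

        server 10.0.1.10:8080;
        server 10.0.1.11:8080;
    }

Some load balancers can also issue and track their own session cookie rather than relying on one set by the application, which is handy when the legacy application can't be changed.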

Some load balancers offer advanced features that come in quite handy for ongoing support of these applications, both in traditional operations and in a DevOps scenario. I find it useful to choose load balancers that offer an API for registering new nodes and collecting meaningful statistics. Load balancers also need to make intelligent decisions about which nodes to pass traffic to: by running consistent health checks, a strong load balancer ensures that it's always serving from a healthy node. I also find that the load balancing layer is a prime place to cache responses from the application server. With as many solutions as there are in this field, it's important to pick one that's feature rich but also mature.
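
As a rough sketch of both ideas, the configuration below (inside the http context) uses passive health checks, marking a node unhealthy after repeated failures, and puts a response cache in front of the application; the cache path, zone name, and timings are assumptions:

    proxy_cache_path /var/cache/nginx keys_zone=app_cache:10m max_size=1g;

    upstream app_servers {
        # Stop sending traffic to a node after 3 failures, retry after 30s
        server 10.0.1.10:8080 max_fails=3 fail_timeout=30s;
        server 10.0.1.11:8080 max_fails=3 fail_timeout=30s;
    }

    server {
        listen 80;

        location / {
            proxy_cache app_cache;
            # Serve cached copies of successful responses for a short time
            proxy_cache_valid 200 302 10m;
            proxy_pass http://app_servers;
        }
    }

Active health checks that probe nodes out of band, and APIs for registering nodes and pulling statistics, tend to be commercial features in this space; the passive checks sketched above are available in the open source build.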

Application security

Sitting at the confluence of client and server in web applications, a load balancer can act as a gateway, quickly and efficiently deciding whether a request should be served. Web applications need to be protected from a multitude of attacks and abuse. Between brute force, cross-site scripting, SQL injection, and packet sniffing, there's a lot to look out for, and a strong load balancer will have an answer for much of it. From basic network origin-based denials to full-fledged user authentication and authorization, many load balancers in the field can serve these perimeter roles. It's best to integrate with identity providers or custom authentication applications that support JSON Web Tokens, in a pattern where the load balancer can verify the token for both authentication and service-level authorization. Understanding the possibilities built into your existing network infrastructure enables you to secure web applications inline without adding extra services or dependencies.
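
The sketch below combines three such perimeter controls in one NGINX server block: network origin-based denial, request rate limiting to blunt brute force attempts, and delegation of token verification to an internal auth service via the auth_request module. The subnets, zone sizes, and the /_auth endpoint are all assumptions for illustration:

    # Allow roughly one request per second per client IP against the login endpoint
    limit_req_zone $binary_remote_addr zone=login_zone:10m rate=1r/s;

    server {
        listen 80;

        location /admin/ {
            # Hypothetical internal subnet that may reach the admin area
            allow 10.0.0.0/8;
            deny  all;
            proxy_pass http://app_servers;
        }

        location /login {
            limit_req zone=login_zone burst=5;
            proxy_pass http://app_servers;
        }

        location /api/ {
            # Have an auth service verify the JWT before the request is proxied
            auth_request /_auth;
            proxy_pass http://app_servers;
        }

        location = /_auth {
            internal;
            # Hypothetical token-verification endpoint
            proxy_pass http://127.0.0.1:9000/verify;
            proxy_pass_request_body off;
            proxy_set_header Content-Length "";
        }
    }

Some load balancers can also validate JWTs natively at the edge, which removes the extra round trip to a separate verification service.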

Deployment and operations

It’s easy to become fatigued by all of the options for hosting applications and services. A software-based load balancer gives you more flexibility, providing the portability to choose between different providers. I’ve built solutions for deploying web applications into some of the top cloud providers, namely Amazon Web Services, Microsoft Azure, and Google Cloud Compute, as well as just about anywhere in Docker containers, and software-based load balancers have always been an elegant solution for deployment and automation.

Once you’ve deployed your load balancers, you’ll need operations processes to manage them. I’ve found that a range of configuration management tools and automation solutions help here. You need automation to manage large-scale deployments: it makes changes easier, more efficient, and more accurate across all of the nodes your team is responsible for. Configuration management and tools like Consul are a valuable part of advanced operations and deployments. Traditional hardware load balancers shoehorned into a virtual machine often pose configuration management problems, because they’re purpose-built systems that usually lack the built-in tools and libraries of common Linux operating systems. Software-based load balancers install on top of common Linux systems, allowing you to bring all of your tools with you.
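
One common pattern, sketched below, is to have consul-template render the upstream block from Consul's service catalog and reload the load balancer whenever membership changes. The service name "web" and the file paths are assumptions:

    # upstream.ctmpl — consul-template source rendered into an NGINX include file
    upstream app_servers {
    {{ range service "web" }}
        server {{ .Address }}:{{ .Port }};
    {{ end }}
    }

Running something along the lines of consul-template -template "upstream.ctmpl:/etc/nginx/conf.d/upstream.conf:nginx -s reload" keeps the pool in sync with the nodes Consul knows about, with no manual edits to the load balancer.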

Making changes to how you deploy is the easy part of operations; knowing what changes to make is more difficult. Logs and metrics help you make calculated decisions about the changes needed to make your application perform better. When tuning any kind of server, experimentation is key: first and foremost, take measurements; second, drive load in an automated fashion so that results are repeatable.
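
For example, the load balancer itself can emit per-request timing data that makes before-and-after comparisons concrete; the log format name and file path below are one possible choice:

    # Record total request time and time spent waiting on the upstream server
    log_format timing '$remote_addr "$request" $status '
                      'rt=$request_time urt=$upstream_response_time';

    access_log /var/log/nginx/timing.log timing;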

There are many ways to tune for performance with any load balancer:

  • Buffer responses between your upstream and your client in memory, to avoid writing to disk (see the configuration sketch after this list)
  • Tune the operating system kernel for a high number of connections and for keeping those connections open
  • Once you know where the kernel is bottlenecking, tune for that particular case and run your automated tests again to measure the impact
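
A sketch of the first two items, with values that are illustrative rather than recommendations:

    http {
        # Buffer upstream responses in memory rather than spilling to temp files on disk
        proxy_buffering on;
        proxy_buffers 16 32k;
        proxy_max_temp_file_size 0;

        # Hold client connections open between requests
        keepalive_timeout 65;

        upstream app_servers {
            server 10.0.1.10:8080;
            # Keep a pool of idle connections open to the upstream servers
            # (the proxying location also needs proxy_http_version 1.1 and an
            # empty Connection header for these connections to be reused)
            keepalive 32;
        }
    }

    # Kernel side, via sysctl on a Linux host (illustrative values only):
    #   net.core.somaxconn = 4096
    #   net.ipv4.ip_local_port_range = 1024 65000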

Conclusion

Load balancers hold a necessary position in a well-architected web application. While there are many choices, hopefully I’ve outlined the important qualities of a good load balancer: it distributes load, adds security to your system, and offers the performance you need while maintaining flexibility in deployment and operations.


This post is part of a collaboration between NGINX and O’Reilly. See our statement of editorial independence.
