
High-volume websites that offload scale to the frontend, using techniques like edge caching with a partner content delivery network (CDN), see many benefits: better performance and a much simpler, more resilient, and potentially cheaper infrastructure to maintain. But one of the main questions I get when talking about this philosophy with folks is: what about security? How do you securely handle things like authorization to APIs, or prevent eavesdropping on and tampering with data in transit, when your application lives mainly on the client side?

It’s easy to think that a mostly client-side site can’t be secure. You may feel compelled to maintain a user session server-side, or think you need a middleware tier to host your secret tokens. My own portfolio is full of sites that have a middleware layer for just this reason. But the truth is that you don’t have to sacrifice security for performance.

Building a secure architecture

Earlier this year, when my team and I set out to re-architect our company’s bill pay site, I brought this same mindset. We built a site that was mostly client-side, because that is so much faster and a more elegant way to scale at the frontend, but kept a very thin middleware layer to securely interface with the backend APIs that were available to us.

Our architecture looked very much like Figure 1 below:

  • A frontend application that loaded the base page and all of the static content from an edge cache via our CDN. We made the frontend its own application, with its own build process and Node clusters, in part because it was cached so heavily at our CDN that we needed only a minimal number of origin nodes to serve it.
  • A middleware tier, exposed via RESTful APIs, that our frontend communicated with and that held all of the keys for communicating with our backend APIs. Because the purpose of our middleware was mainly to serve personalized information, we implemented minimal caching, which led us to set up a fair number of virtual machines to serve this tier.
  • The backend APIs that tied directly into the company’s infrastructural services, like bill pay.

Figure 1 - Our initial architecture consisting of a frontend that is served up via our CDN’s edge network, calling our middleware layer, which in turn calls the shared backend APIs.

This worked and it was blazing fast. As long as the supporting APIs maintained their SLAs we achieved sub-second initial page load, which was a huge success because the legacy site that we replaced took tens of seconds for page load. This was achieved, in good part, by moving all of the functionality that was previously server-side to be loaded asynchronously from the client-side. Some calls were several seconds long just because they were expensive and involved. Previously that would have blocked the whole page from being returned to the client, but with our new architecture it just prevented a single module on the page from loading; the rest of the page was still usable.
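The pattern described above can be sketched in a few lines. This is a minimal illustration, not our production code: each page module kicks off its own data call and renders independently, so a slow, expensive call only delays its own module rather than blocking the page. The `fetchBalance`/`fetchHistory` names and delays are hypothetical stand-ins for real API calls.

```javascript
// Sketch: kick off every module's data call at once and render each one as
// it resolves, instead of awaiting them in sequence server-side.
// fetchBalance/fetchHistory are hypothetical stand-ins for real API calls.

const delay = (ms, value) => new Promise((r) => setTimeout(() => r(value), ms));

const fetchBalance = () => delay(50, { balance: 42.5 }); // fast call
const fetchHistory = () => delay(2000, { items: [] });   // slow, expensive call

function loadModules(modules, onRender) {
  // Each module renders on its own; a slow one never blocks the others.
  for (const [name, load] of Object.entries(modules)) {
    load().then((data) => onRender(name, data));
  }
}

loadModules(
  { balance: fetchBalance, history: fetchHistory },
  (name, data) => console.log(`rendered ${name}`, data)
);
```

In a browser, `onRender` would update a module's DOM node; here it just logs, so the balance module "renders" long before the slow history call completes.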

Continual improvement

But my team and I adhere to the philosophy of kaizen—the drive to continually increase efficiency. Not long after our launch my team came to me and said that they wanted to do away with the middle tier; ultimately, we could communicate securely with the backend APIs directly. That would save us from having to maintain and update all the nodes that the middleware tier took up, save us the operating expense of running application performance management and monitoring tools on those nodes, and save us time on each deployment.

“But what about security?” I asked. They patiently walked me through their idea, and had our security group verify it. Essentially the findings were as follows.

Using asymmetric cryptography to encrypt form data that users enter via JavaScript on the client-side and sending the data to the backend securely via TLS would adhere to our security requirements.

If the data was encrypted client-side, even if the transmission was intercepted, it would be unreadable to a third party who did not have access to the keys to decrypt it.

As for authorization, we leveraged OAuth 2 to provide access to our backend APIs from the frontend. As long as the web servers exposing the backend APIs supported CORS (cross origin resource sharing) and our client domain names were allowed, we would be able to securely access them.

The architecture resembled Figure 2, and the user flow would be like so:

  • User enters data into form (maybe name, address, and credit card number) and hits submit.
  • A JavaScript crypto library is used to encrypt the form data, using an RSA public key provided by the backend API.
  • The encrypted data is sent securely over HTTPS to the backend API.
  • The backend API decrypts the data and does whatever it needs to with the data, behind our network.

Figure 2 - Our streamlined architecture, leveraging a JavaScript crypto library to encrypt user-entered data before sending it over TLS to our backend APIs.

Keep in mind that this solution is for a site that accepts bill payments, so it has some pretty strong security requirements. If your site has no such requirements, you most likely won’t need to encrypt content before sending it to the backend.

Three steps to more secure scaling

You should be able to securely implement a blazing fast high traffic website that’s scaled at the frontend by taking these measures:

  • Encrypt data client-side before transmitting it to the server
  • Run all transmissions over HTTPS using the latest version of TLS
  • Leverage application authorization protocols such as OAuth 2 and allow access to backend services from approved domains using CORS

For more information on leveraging cache to scale at the frontend, check out my book Intelligent Caching.


This article is a collaboration between StackPath and O’Reilly. See our statement of editorial independence.

Article image: Scales (source: Jollycat).