Chapter 2. Understanding Knative Serving

Knative Serving is ideal for running your application services inside Kubernetes: it provides a simplified deployment syntax with automated scale-to-zero and scale-out based on HTTP load. The Knative platform manages your service’s deployments, revisions, networking, and scaling for you.

Knative Serving exposes your service via an HTTP URL and ships with safe defaults for its configuration. For many practical use cases you may need to tweak those defaults and adjust the traffic distribution among your service’s revisions. Because a Knative Serving Service has the built-in ability to automatically scale down to zero when not in use, it is appropriate to call it a serverless service.
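As a point of reference, a minimal Knative Service manifest might look like the following sketch; the service name and container image are hypothetical placeholders, not values from this book:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: greeter                 # hypothetical service name
spec:
  template:                     # the Revision template: your code + configuration
    spec:
      containers:
        - image: quay.io/example/greeter:v1   # placeholder container image
```

Note how little boilerplate is required compared to a raw Kubernetes Deployment plus Service plus Ingress: the URL, scaling, and revision tracking are all handled by Knative.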

In this chapter, we are going to deploy a Knative Serving Service, examine its use of Configurations and Revisions, and practice a blue-green deployment and a canary release.
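To preview where we are headed, blue-green and canary rollouts are expressed declaratively through the traffic block of the Service resource. The sketch below shows the general shape; the revision names, image, and percentages are illustrative assumptions:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: greeter                 # hypothetical service name
spec:
  template:
    metadata:
      name: greeter-v2          # new (green/canary) revision
    spec:
      containers:
        - image: quay.io/example/greeter:v2   # placeholder image
  traffic:
    - revisionName: greeter-v1  # existing (blue) revision keeps most traffic
      percent: 80
    - revisionName: greeter-v2  # canary receives 20% of requests
      percent: 20
```

Shifting the percentages (eventually to 0/100) completes the rollout without redeploying anything.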

Knative Serving Deployment Model

Before you deploy your first serverless service, it is important that you understand its deployment model and the Kubernetes resources that make up a Knative Service.

During the deployment of a Knative Serving Service (ksvc) as shown in Figure 2-1, the Knative Serving controller creates a Configuration, a Revision, and a Route, which deserve additional explanation:

Knative Configuration

The Knative Configuration maintains the desired state of your deployment, providing a clean separation of code and configuration using the twelve-factor ...
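To illustrate, the Configuration generated for a Service might resemble the following sketch (the name and image are placeholders). You normally do not author this resource directly; the Knative Serving controller derives it from the Service:

```yaml
apiVersion: serving.knative.dev/v1
kind: Configuration
metadata:
  name: greeter                 # inherited from the hypothetical Service name
spec:
  template:                     # each change here stamps out a new Revision
    spec:
      containers:
        - image: quay.io/example/greeter:v1   # placeholder image
```

Any change to the template (a new image, an environment variable) causes the controller to create a new immutable Revision, which is what enables the rollout patterns covered later in this chapter.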
