CHAPTER 8: Deploying AI Models as Microservices
In the previous chapter, we talked about Cloud computing, containers, and microservices. We saw how Kubernetes extends beyond a Container-as-a-Service (CaaS) platform into a full ecosystem for deploying software applications packaged as microservices. We also saw an example of deploying an application on Kubernetes by using abstractions like pods, deployments, and services.
In this chapter, we get into more of the details of building applications with Kubernetes. We build a simple web application using Python, package it as a Docker container, and deploy it to a Kubernetes cluster. Then we modify this application to invoke a Deep Learning model and show the results on a web page. Here we start connecting the Keras and Kubernetes worlds together. We see how to build production-quality Deep Learning applications, combining the best of these two technologies.
Building a Simple Microservice with Docker and Kubernetes
Let's get started by building a simple microservice application and then packaging it into a container. The idea behind microservices is that each application is self-contained, so it can be deployed and scaled independently as a container instance. At first, our application will only display a simple message by reading a text string. Later, we will do some processing on that text string.
We will use Python to build this web application. Python was traditionally used more for scripting and data science applications. However, ...
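A minimal version of such a web application might look like the following sketch. It uses Flask as the web framework; the framework choice, route, and `text` query parameter here are illustrative assumptions, not necessarily the ones used later in the chapter.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Read a text string from the query parameters; fall back to a default message.
    # Later versions of the app would do real processing on this string.
    text = request.args.get("text", "Hello from our microservice!")
    return text

if __name__ == "__main__":
    # Bind to all interfaces so the app is reachable from outside
    # the container once it is packaged with Docker.
    app.run(host="0.0.0.0", port=5000)
```

Running this locally with `python app.py` and visiting `http://localhost:5000/?text=hi` would echo the string back, which is enough of a skeleton to containerize and deploy.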