Trained models in Docker containers

In the last section, we created a REST server for classifying images. In this section, we'll prepare a Docker container to provide a reliable runtime environment for that server. Along the way, we'll ask why Docker is a good fit for packaging machine learning models. We'll then train a model and save it for use inside the container, write the server Dockerfile that packages everything together (a sketch of what such a Dockerfile might look like appears below), and finally build the Docker image that serves as the reusable runtime for our REST service.
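To make that workflow concrete before we dig into the details, here is a minimal sketch of the kind of Dockerfile this section builds toward. The base image, file names (requirements.txt, model.h5, server.py), and port are illustrative assumptions rather than the exact files used in this book:

    # Illustrative sketch: package a saved model and its REST server into an image.
    # Start from an official TensorFlow image so the serving runtime matches training.
    FROM tensorflow/tensorflow:latest

    WORKDIR /app

    # Install the REST server's Python dependencies.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Copy the saved, trained model and the server code into the image.
    COPY model.h5 .
    COPY server.py .

    # Expose the port the REST service listens on and start the server.
    EXPOSE 5000
    CMD ["python", "server.py"]

With a Dockerfile like this in place, a command such as docker build -t image-classifier . produces the image (the tag image-classifier is just a placeholder), and docker run -p 5000:5000 image-classifier starts the REST service, so the same trained model runs the same way on any machine with Docker installed.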

So, why Docker? Fundamentally, it makes your trained model portable. Unlike most of the programs you've created, which ...
