Packaging Machine Learning Models with Docker

Video description

One of the important aspects of MLOps (Machine Learning Operations, or operationalizing machine learning) is packaging ML models. How exactly do you package an ML model? In this video I show you exactly what that means and walk through the process of packaging an ONNX model taken from the ONNX Model Zoo. I end up with a Docker container that can be shared, exposing an API that is ready to consume and perform live predictions for sentiment analysis.
Topics include:
* Understand the concepts behind packaging machine learning models
* Create a sentiment analysis API tool with Flask
* Define dependencies and a Dockerfile for packaging
* Create a container with an ONNX model that can be deployed anywhere with an HTTP API
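The Flask API described in the topics above might look like the following minimal sketch. The route name and the placeholder `predict` function are illustrative assumptions, not code from the video; a real implementation would tokenize the input and run the RoBERTa model through an ONNX Runtime `InferenceSession`:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict(text: str) -> dict:
    # Placeholder standing in for ONNX Runtime inference with the
    # RoBERTa sentiment model: a real version would tokenize `text`
    # and feed it to an onnxruntime.InferenceSession.
    score = 0.9 if "good" in text.lower() else 0.1
    return {"positive": score, "negative": round(1 - score, 2)}

@app.route("/predict", methods=["POST"])
def predict_route():
    # Expect a JSON body like {"text": "..."}
    body = request.get_json(force=True)
    return jsonify(predict(body["text"]))
```

Inside the container you would serve this with `app.run(host="0.0.0.0")` or a production WSGI server such as gunicorn, so the HTTP API is reachable from outside the container.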
A few resources that are helpful if you are trying to get started with packaging ML models:
* The RoBERTa ONNX Model
* Schema labeling concepts for Docker containers
* The Practical MLOps code repository, full of examples
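The packaging step, combining dependencies, the model, and schema-style labels, can be sketched as a Dockerfile. The file names (`app.py`, `requirements.txt`, the model file) and label values here are illustrative assumptions, not taken from the video; the `LABEL` keys follow the OCI image-spec annotation convention:

```dockerfile
FROM python:3.9-slim

# Schema-style labels describing the image (OCI image-spec annotation keys)
LABEL org.opencontainers.image.title="sentiment-analysis-api"
LABEL org.opencontainers.image.description="Flask API serving a RoBERTa ONNX model"

WORKDIR /app

# Install pinned dependencies first so this layer caches across rebuilds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and the ONNX model into the image
COPY app.py roberta-sequence-classification-9.onnx ./

EXPOSE 5000
CMD ["python", "app.py"]
```

Once built, the resulting image can be pushed to any registry and run anywhere Docker is available, exposing the prediction API over HTTP.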

Table of contents

  1. Lesson 1
    1. "ML Model Packaging"

Product information

  • Title: Packaging Machine Learning Models with Docker
  • Author(s): Alfredo Deza, Noah Gift
  • Release date: May 2021
  • Publisher(s): Pragmatic AI Solutions