16
Integrating ML Systems in Ecosystems
ML systems have gained a lot of popularity for two reasons: their ability to learn from data (which we've explored throughout this book), and their ability to be packaged into web services.
Packaging these ML systems into web services allows us to integrate them into workflows in a very flexible way. Instead of compiling them into our programs or linking them as dynamic libraries, we can deploy ML components that communicate over HTTP using JSON payloads. We have already seen this style of communication when we used the GPT-3 model hosted by OpenAI. In this chapter, we'll explore the possibility of creating a Docker container with a pre-trained ML model, deploying it, and integrating it with other components.
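To make the HTTP/JSON idea concrete, here is a minimal sketch of such a prediction service using only Python's standard library. The `predict` function is a hypothetical stand-in for a real pre-trained model (it simply averages the input features); in practice you would load a serialized model at startup and call it here.

```python
# Minimal sketch of an ML prediction service speaking JSON over HTTP.
# NOTE: predict() is a hypothetical placeholder for a real pre-trained model.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def predict(features):
    """Placeholder "model": returns the mean of the input features."""
    return sum(features) / len(features)


class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the JSON request body, e.g. {"features": [1.0, 2.0]}
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))

        # Run the model and serialize the result back to JSON
        result = {"prediction": predict(payload["features"])}
        body = json.dumps(result).encode("utf-8")

        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Bind to all interfaces so the service is reachable from other containers
    HTTPServer(("0.0.0.0", 8080), PredictHandler).serve_forever()
```

Because the service only depends on HTTP and JSON, any client, in any language, can call it, and the whole script can be copied into a Docker image and run as a container alongside the other components of a workflow.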