Chapter 7: Multi-Step Deep Learning Inference Pipeline

Now that we have successfully run HPO (hyperparameter optimization) and produced a well-tuned DL model that meets the business requirements, it is time to move to the next step: using this model for prediction. This is where the model inference pipeline comes into play, where the model is used to predict or score real-world data in production, in either real-time or batch mode. However, an inference pipeline usually does not rely on a single model alone; it also needs preprocessing and postprocessing logic that is not necessarily seen during the model development stage. Examples of preprocessing steps include detecting the language locale (English or another language) before passing ...
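To make the idea concrete, here is a minimal sketch of how such a multi-step pipeline can be expressed as a custom MLflow pyfunc model, with language detection as the preprocessing step. This is not the chapter's exact code: the `langdetect` package, the `finetuned_model_uri` parameter, and the `text` input column are illustrative assumptions.

```python
# A minimal sketch of a multi-step inference pipeline wrapped as an MLflow
# pyfunc model: preprocessing (language detection), model prediction, and a
# simple postprocessing/routing step for non-English inputs.
import mlflow.pyfunc
import pandas as pd
from langdetect import detect  # assumed preprocessing dependency


class InferencePipeline(mlflow.pyfunc.PythonModel):
    def __init__(self, finetuned_model_uri):
        # URI of the previously trained and tuned DL model (hypothetical)
        self.finetuned_model_uri = finetuned_model_uri

    def load_context(self, context):
        # Load the fine-tuned model once, when the pyfunc model is loaded
        self.model = mlflow.pyfunc.load_model(self.finetuned_model_uri)

    def preprocess(self, text):
        # Preprocessing step: detect the language locale of the input text
        return detect(text)

    def predict(self, context, model_input):
        responses = []
        for text in model_input["text"]:
            if self.preprocess(text) == "en":
                # Score English inputs with the fine-tuned model
                prediction = self.model.predict(pd.DataFrame({"text": [text]}))
                responses.append(prediction)
            else:
                # Postprocessing/routing step: flag non-English inputs
                responses.append("non_english_detected")
        return pd.DataFrame({"response": responses})
```

Wrapping the whole pipeline as a single pyfunc model means the preprocessing and postprocessing logic travels with the model artifact, so the same multi-step behavior is available whether the model is served in real time or used for batch scoring.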
