Chapter 2. Exporting Models

Before delving into model serving, it is necessary to discuss exporting models. As discussed previously, data scientists define models and engineers implement model serving; the ability to export models from data science tools is therefore essential.

For this book, I will use two different examples: Predictive Model Markup Language (PMML) and TensorFlow. Let’s look at the ways in which you can export models using these tools.

TensorFlow

To simplify the implementation of model scoring, TensorFlow supports exporting trained models, which Java APIs can then use for scoring. The TensorFlow Java APIs do not perform the actual processing; they are thin Java Native Interface (JNI) wrappers on top of the TensorFlow C++ implementation. Consequently, using them requires linking the TensorFlow C++ native library into your Java application.
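In practice, the linking is handled by a build dependency that bundles the native library. A sketch of what this looks like with Maven, assuming the TensorFlow 1.x-era `org.tensorflow:tensorflow` artifact (the version number here is illustrative):

```xml
<!-- Pulls in both the Java API (libtensorflow) and the
     platform-specific JNI native library (libtensorflow_jni). -->
<dependency>
  <groupId>org.tensorflow</groupId>
  <artifactId>tensorflow</artifactId>
  <version>1.4.0</version> <!-- illustrative version -->
</dependency>
```

With this dependency on the classpath, the JNI wrapper locates and loads the bundled native library at runtime, so no separate installation of the C++ binaries is required.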

TensorFlow currently supports two types of model export: export of the execution graph, which can be optimized for inference, and the newer SavedModel format, introduced in 2017.
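To give a flavor of the SavedModel side before we turn to execution graphs, here is a minimal sketch using the `tf.saved_model` API from current TensorFlow releases. The model, its shapes (11 features, 6 quality classes, loosely matching the wine dataset used below), and the output path are all hypothetical stand-ins:

```python
import tensorflow as tf

# Toy stand-in for a trained classifier; in practice you would export
# the model produced by your training code.
class WineClassifier(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.random.normal([11, 6]), name="w")
        self.b = tf.Variable(tf.zeros([6]), name="b")

    # A concrete input signature lets SavedModel record a serving function.
    @tf.function(input_signature=[tf.TensorSpec([None, 11], tf.float32)])
    def score(self, x):
        return tf.nn.softmax(tf.matmul(x, self.w) + self.b)

model = WineClassifier()
# Writes the SavedModel directory: saved_model.pb plus a variables/ folder.
tf.saved_model.save(model, "/tmp/wine_savedmodel")
```

The resulting directory is self-describing: it contains the graph, the variable values, and the signature of the scoring function, which is what makes SavedModel convenient for serving tools.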

Exporting the Execution Graph

Exporting the execution graph is the "standard" TensorFlow approach to saving a model. Let's look at an example that adds an execution-graph export to a multiclass classification implementation built with Keras on a TensorFlow backend, applied to an open source wine quality dataset (complete code).

Example 2-1. Exporting an execution graph from a Keras model
... # Create TF session and set ...
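The example above is elided here; as a rough self-contained sketch of the same idea, the following uses the `tf.compat.v1` graph-mode API to freeze variables into constants and write the graph out, with a toy model standing in for the book's Keras wine-quality classifier (the shapes, node names, and output path are hypothetical):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()  # graph-mode API, as in TF 1.x

# Toy stand-in for the wine-quality classifier:
# 11 input features, 6 quality classes (hypothetical shapes).
x = tf1.placeholder(tf.float32, [None, 11], name="input")
w = tf1.get_variable("w", [11, 6])
b = tf1.get_variable("b", [6], initializer=tf1.zeros_initializer())
scores = tf.nn.softmax(tf1.matmul(x, w) + b, name="output")

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    # Fold the current variable values into the graph as constants,
    # producing a single self-contained GraphDef for inference.
    frozen = tf1.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["output"])
    # Serialize the frozen graph to disk; a Java/JNI scorer can load
    # this protobuf and feed the "input" node.
    tf1.train.write_graph(frozen, "/tmp", "wine_model.pb", as_text=False)
```

The key step is `convert_variables_to_constants`: after freezing, the graph no longer depends on checkpoint files, which is what makes it convenient to ship to a Java scoring service.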
