November 2017
Intermediate to advanced
304 pages
6h 58m
English
On your production server, you need to install TensorFlow Serving and its prerequisites. You can follow the setup instructions on the official TensorFlow Serving website at https://tensorflow.github.io/serving/setup. Next, we will use the standard TensorFlow Model Server provided by TensorFlow Serving to serve the model. First, we need to build the tensorflow_model_server binary with the following command:
bazel build //tensorflow_serving/model_servers:tensorflow_model_server
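Once the build finishes, the resulting binary is placed under `bazel-bin`. As a sketch of how it is launched (the port number and model name here are illustrative examples, not values from the original text; adjust them to your deployment):

```shell
# Start the model server (port and model name are example values).
# --model_base_path must point at the directory on the production
# server that contains the versioned model subdirectories.
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --port=9000 \
  --model_name=pet_model \
  --model_base_path=/home/ubuntu/productions
```

The server watches the base path and automatically serves the highest-numbered version directory it finds there.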
Copy all the files from /home/ubuntu/models/pet_model on your training server to your production server. In our setup, we use /home/ubuntu/productions as the folder that stores all the production models. The productions folder will have the following structure:
- /home/ubuntu/productions/ ...
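One way to perform the copy is with scp, run from the training server; the hostname below is a placeholder for your production server, not a value from the original text:

```shell
# Copy the exported model from the training server into the
# productions folder on the production server.
# "production-host" is a placeholder hostname.
scp -r /home/ubuntu/models/pet_model \
    ubuntu@production-host:/home/ubuntu/productions/
```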