TensorFlow Lite (https://www.tensorflow.org/mobile/tflite) is a lightweight solution for running deep learning models on mobile and embedded devices. The TensorFlow Lite model format is based on FlatBuffers (https://google.github.io/flatbuffers), a serialization library similar to, but faster and much smaller than, the protocol buffers we discussed in Chapter 3, Detecting Objects and Their Locations. If a model built in TensorFlow or Keras can be successfully converted to this format, you can expect it to run with low latency and a small binary size. The basic workflow of using TensorFlow Lite in your mobile apps is as follows:
- Build and train (or retrain) a TensorFlow model with TensorFlow or Keras ...
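As a minimal sketch of this first step, the snippet below builds and trains a tiny Keras model on synthetic data and saves it in TensorFlow's SavedModel format, which the TensorFlow Lite converter accepts as input. The architecture, data, and `simple_model` path are all illustrative, not from the original workflow.

```python
import numpy as np
import tensorflow as tf

# Synthetic toy data: label is 1 when the feature sum exceeds 2.0.
x = np.random.rand(100, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")

# A small Keras model; any TensorFlow/Keras model works the same way.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=2, verbose=0)

# Save in the SavedModel format so it can later be fed to the
# TensorFlow Lite converter (tf.lite.TFLiteConverter.from_saved_model).
model.save("simple_model")
```

In practice you would train for many more epochs on real data; the point here is only the build, train, and save sequence that precedes conversion.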