Chapter 8. Going Deeper: Understanding TensorFlow Lite
Underlying all of the machine learning technology that you've seen so far in this book is TensorFlow, a framework that lets you architect, train, and test machine learning models; we introduced it in Chapters 1 and 2.
TensorFlow models are usually not designed for mobile scenarios, where one has to consider size, battery consumption, and everything else that can impact the mobile user experience. To that end, TensorFlow Lite was created with two main aims. The first is to convert existing TensorFlow models into a smaller, more compact format, with an eye on optimizing them for mobile. The second is to provide an efficient runtime for model inference on various mobile platforms. In this chapter, we'll explore TensorFlow Lite and take a deeper look at the tooling that's available for you to convert models trained with TensorFlow, as well as how to use tools to optimize them.
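To make those two aims concrete, here is a minimal Python sketch of both steps: converting a trained model with the TFLiteConverter and then running inference with the TensorFlow Lite Interpreter. The SavedModel directory name and the zero-filled input are placeholders for illustration only; the conversion and optimization options are covered in more depth later in the chapter.

import numpy as np
import tensorflow as tf

# Aim 1: convert a trained SavedModel into the smaller TFLite flatbuffer format.
# "my_saved_model" is a hypothetical path to a model you have already trained.
converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional size/latency optimization
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# Aim 2: run inference with the lightweight TensorFlow Lite interpreter.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])

On a device you would typically use the platform's own runtime (for example, the Android or iOS TensorFlow Lite libraries) rather than the Python interpreter, but the flow of load, allocate, set inputs, invoke, and read outputs is the same.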
We'll start with a brief tour of why it's important, and then we can roll our sleeves up and get down to the bits and bytes…
What Is TensorFlow Lite?
The need for something like TensorFlow Lite was driven by several factors. The first is the explosion in the number of personal devices. Mobile devices running iOS or Android already outnumber traditional desktops or laptops as primary computing devices, and embedded systems outnumber mobile devices. ...