Chapter 8. Going Deeper: Understanding TensorFlow Lite

Underlying all of the machine learning technology that you’ve seen so far in this book is TensorFlow, a framework that allows you to architect, train, and test machine learning models; it was introduced in Chapters 1 and 2.

TensorFlow models are usually not designed for mobile scenarios, where one has to consider size, battery consumption, and everything else that can impact the mobile user experience. To that end, TensorFlow Lite was created with two main aims. The first is to convert existing TensorFlow models into a smaller, more compact format, optimizing them for mobile. The second is to provide an efficient runtime for model inference on a variety of mobile platforms. In this chapter, we’ll explore TensorFlow Lite, taking a deeper look at the tooling that’s available for converting models trained with TensorFlow, as well as the tools for optimizing them.
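To give a flavor of the first aim before we dig in, here is a minimal sketch of the conversion workflow using the `tf.lite.TFLiteConverter` API. The tiny Keras model is just a stand-in for whatever trained model you might have; the optimization flag is optional and is covered in more depth later:

```python
import tensorflow as tf

# A tiny stand-in model; in practice this would be your trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, input_shape=(4,))
])
model.compile(optimizer="sgd", loss="mse")

# Convert the model to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional size/latency optimization
tflite_model = converter.convert()

# Save the .tflite file for use with the on-device interpreter.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The result is a single flatbuffer file that the TensorFlow Lite runtime can load and execute on-device.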

We’ll start with a brief tour of why it’s important, and then we can roll our sleeves up and get down to the bits and bytes…

What Is TensorFlow Lite?

The rationale behind the need for something like TensorFlow Lite was driven by several factors. The first is the explosion in the number of personal devices. Mobile devices running iOS or Android already outnumber traditional desktops or laptops as primary computing devices, and embedded systems outnumber mobile devices. The need for ...
