Chapter 3. tf.function and AutoGraph

Eager execution works well if you are comfortable with linear algebra, but not everybody is, and the tensor API is not always straightforward to use. tf.function and AutoGraph address this: they turn ordinary Python code into instructions that the TensorFlow execution engine can consume.

Understanding tf.function

Now that we’ve seen how much easier eager execution makes TensorFlow to use, it’s the perfect time to introduce tf.function. With tf.function, arbitrary Python code can be transformed into a TensorFlow execution graph simply by annotating a function.

In other words, write the code you want in pure Python, wrap it in a function, and annotate that function. The result is an object that can be run as a TensorFlow program, without your having to write a single line of graph-construction code.
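As a minimal sketch of the idea (the function name and values here are illustrative, not taken from the text), annotating a plain Python function is all it takes:

```python
import tensorflow as tf

@tf.function
def scaled_sum(a, b):
    # Ordinary Python arithmetic; tf.function traces it into a graph
    return 2 * (a + b)

# Calling the annotated function executes the traced TensorFlow graph
result = scaled_sum(tf.constant(1.0), tf.constant(2.0))
print(result.numpy())  # 6.0
```

The first call traces the Python body into a graph; subsequent calls with compatible inputs reuse that graph rather than re-executing the Python code.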

Too good to be true? Let’s see an example.

In this example, we minimize a nested function. There are two functions: f(x) = x − (6/7)x − 1/7 and g(x) = f(f(f(f(x)))). The goal is to find x such that g(x) equals 0. If we relax the condition a bit, we can instead look for an x that minimizes g (in practice, a quantity such as g(x)², which reaches its minimum exactly where g(x) = 0). Does it ring a bell? This is of course an optimization problem, which we can solve with TensorFlow’s automatic differentiation and gradient descent.
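One way this could be sketched with tf.function and gradient descent is shown below. This is an illustrative implementation, not necessarily the book's: it assumes f(x) = x − (6/7)x − 1/7 as stated above, and it minimizes g(x)² so that the minimum sits where g(x) = 0. The learning rate and step count are assumptions chosen for this particular function, which is very flat (its slope is 1/2401).

```python
import tensorflow as tf

def f(x):
    # One step of the nested function: f(x) = x - (6/7)x - 1/7
    return x - (6.0 / 7.0) * x - 1.0 / 7.0

def g(x):
    # Four nested applications: g(x) = f(f(f(f(x))))
    return f(f(f(f(x))))

x = tf.Variable(0.0)
learning_rate = 1e6  # g is very flat, so a large rate is needed (assumption)

@tf.function
def train_step():
    # Minimize g(x)^2, whose minimum lies where g(x) = 0
    with tf.GradientTape() as tape:
        loss = tf.square(g(x))
    grad = tape.gradient(loss, x)
    x.assign_sub(learning_rate * grad)
    return loss

for _ in range(100):
    train_step()

print(x.numpy())  # gradient descent drives x toward 400, where g(x) = 0
```

Because f is affine, g works out to (x − 400)/2401, so the iteration converges to x = 400; the tf.function annotation means the whole training step runs as a single traced graph.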


Finding the minimum of g is called the “Turnip Seller Problem.”1 The solution can be obtained simply by setting the first derivative g′(x) to zero. As TensorFlow ...
