Automatic differentiation – losses and optimizers

TensorFlow uses automatic differentiation: a differentiator is an object that contains all the rules required to build a new graph that computes the derivative of each node it traverses. The tf.train module in TensorFlow 1.x contains the most widely used kind of differentiator, here called an optimizer. Among the optimizers in this module, you can find the Adam optimizer as tf.train.AdamOptimizer and the standard gradient descent optimizer as tf.train.GradientDescentOptimizer. Each optimizer is an object that implements a common interface, which standardizes how an optimizer is used to train a model. Performing a mini-batch gradient descent step is then just a matter of executing the training operation that the optimizer's minimize method returns.
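As a minimal sketch of that interface in TensorFlow 1.x (the linear model, tensor shapes, learning rate, and dummy data below are illustrative assumptions, not taken from the book):

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

# Placeholders for a mini-batch of inputs and targets (shapes are illustrative).
x = tf.placeholder(tf.float32, shape=[None, 3])
y = tf.placeholder(tf.float32, shape=[None, 1])

# A toy linear model: predictions = x @ W + b.
W = tf.Variable(tf.zeros([3, 1]))
b = tf.Variable(tf.zeros([1]))
predictions = tf.matmul(x, W) + b

# Mean squared error loss.
loss = tf.reduce_mean(tf.square(predictions - y))

# minimize() builds the gradient graph and returns the training operation.
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss)

# Dummy mini-batch, just so the sketch runs end to end.
batch_x = np.random.rand(32, 3).astype(np.float32)
batch_y = np.random.rand(32, 1).astype(np.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Each run of train_op performs one mini-batch gradient descent step.
    sess.run(train_op, feed_dict={x: batch_x, y: batch_y})
```

Swapping tf.train.GradientDescentOptimizer for tf.train.AdamOptimizer changes nothing else in this code, which is exactly the point of the shared interface.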