© Umberto Michelucci 2022
U. Michelucci, Applied Deep Learning with TensorFlow 2
https://doi.org/10.1007/978-1-4842-8020-1_5

5. Advanced Optimizers

Umberto Michelucci
Dübendorf, Switzerland

In general, optimizers are algorithms that minimize a given function. Recall that training a neural network simply means minimizing its loss function. Chapter 1 looked at the gradient descent optimizer and its variations (mini-batch and stochastic gradient descent). This chapter covers more advanced and efficient optimizers, in particular Momentum, RMSProp, and Adam. We cover the mathematics behind them and then explain how to implement and use them in Keras.

Available Optimizers in Keras in TensorFlow 2.5

TensorFlow has evolved a lot in the last few years. ...
