Optimizers adjust the model weights to minimize the loss function. There are several types of optimizers you may come across in deep learning.
Adam is a popular choice of optimizer and can be seen as a combination of RMSprop and SGD with momentum. It is an adaptive learning rate optimization algorithm that computes individual learning rates for different parameters.
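As a minimal sketch of how this looks in practice (assuming TensorFlow's Keras API; the model architecture, learning rate, and loss are illustrative choices, not from the source), compiling a small network with the Adam optimizer might look like this:

```python
import tensorflow as tf

# Illustrative model: a small feed-forward binary classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Adam maintains running estimates of each parameter's gradient mean
# (first moment) and uncentered variance (second moment), and uses them
# to scale a per-parameter step size -- hence "adaptive learning rate".
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)

model.compile(
    optimizer=optimizer,
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
```

The momentum-like first-moment estimate is what Adam inherits from SGD with momentum, while the second-moment scaling is what it shares with RMSprop; the default learning rate of 0.001 is a common starting point that is then tuned per task.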