An optimizer is the algorithm that updates the weights of a neural network during training. Optimizers in Keras are based on the gradient descent algorithm, which we covered in an earlier section.
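To make the connection to gradient descent concrete, here is a minimal sketch of the vanilla weight update rule that these optimizers build on. The learning rate of 0.01 and the quadratic loss are illustrative assumptions, not values from the text:

```python
import numpy as np

# Vanilla gradient descent: w <- w - learning_rate * gradient.
# Illustrative example minimizing the loss L(w) = (w - 3)^2,
# whose gradient is dL/dw = 2 * (w - 3).
learning_rate = 0.01  # assumed value, for illustration only
w = np.array(0.0)

for step in range(1000):
    grad = 2 * (w - 3.0)          # gradient of the loss at the current weight
    w = w - learning_rate * grad  # step the weight against the gradient

print(w)  # converges towards the minimizer w = 3.0
```

Optimizers such as Adam and Adagrad refine this basic rule, for example by adapting the step size per weight, but the underlying idea of moving against the gradient is the same.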
While we won't cover the differences between each optimizer in detail, it is important to note that our choice of optimizer should depend on the nature of the problem. In general, researchers have found that the Adam optimizer works best for DNNs, while the SGD optimizer works best for shallow neural networks. The Adagrad optimizer is also a popular choice; it adapts the learning rate of the algorithm based on how frequently a particular set of weights is updated. The main advantage ...
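As a concrete illustration of how these optimizers are selected in Keras, the sketch below compiles the same small model with each of the three optimizers mentioned above. The model architecture is an assumption chosen purely for demonstration, and the learning rates shown are simply the Keras defaults:

```python
import tensorflow as tf
from tensorflow.keras import layers

# A small illustrative model; the architecture is an assumption
# chosen only to demonstrate optimizer selection.
def build_model():
    return tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        layers.Dense(64, activation='relu'),
        layers.Dense(1, activation='sigmoid'),
    ])

# Each built-in optimizer is configured the same way and passed
# to model.compile(); the learning rates are the Keras defaults.
for optimizer in [
    tf.keras.optimizers.Adam(learning_rate=0.001),     # adaptive; common choice for DNNs
    tf.keras.optimizers.SGD(learning_rate=0.01),       # plain stochastic gradient descent
    tf.keras.optimizers.Adagrad(learning_rate=0.001),  # per-weight adaptive learning rates
]:
    model = build_model()
    model.compile(optimizer=optimizer,
                  loss='binary_crossentropy',
                  metrics=['accuracy'])
```

Because every optimizer is passed to `model.compile()` in the same way, swapping one for another requires changing only a single line, which makes it easy to experiment with the optimizers discussed above.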