Optimizers include SGD, RMSprop, and Adam. We saw a few examples of optimizers in Chapter 1, Neural Networks Foundations, and more (Adagrad and Adadelta; for more information, refer to https://keras.io/optimizers/) will be presented in the following chapters. A minimal sketch of how an optimizer is chosen when compiling a Keras model follows; the layer sizes and hyperparameters are illustrative, not taken from a specific example in the book.
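from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD, RMSprop, Adam

# A small illustrative model: 784 inputs -> 10-class softmax output.
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))

# Any of these optimizer instances can be passed to compile();
# the learning rates shown are the usual Keras defaults.
optimizer = SGD(lr=0.01, momentum=0.9)      # stochastic gradient descent with momentum
# optimizer = RMSprop(lr=0.001)             # RMSprop
# optimizer = Adam(lr=0.001)                # Adam

model.compile(optimizer=optimizer,
              loss='categorical_crossentropy',
              metrics=['accuracy'])

Swapping one optimizer for another only requires changing the object passed to compile(); the rest of the training code stays the same.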