June 2022 · Intermediate to advanced · 600 pages · English
At this point, we have learned the basics of neural networks and three types of architectures: fully connected, convolutional, and recurrent. These networks have been trained with an approach called stochastic gradient descent (SGD), which has been in use since at least the 1960s. Since then, newer techniques for learning a network's parameters have been developed, such as momentum and learning rate decay, which can improve training for nearly any neural network on any problem ...
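To make the two improvements mentioned above concrete, here is a minimal sketch, not the book's own code, of SGD with momentum combined with exponential learning rate decay, applied to a simple quadratic function whose gradient we can write by hand. The function `sgd_momentum_step`, the decay factor `0.99`, and the momentum coefficient `beta=0.9` are illustrative choices, not values taken from the text.

```python
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr, beta=0.9):
    """One SGD-with-momentum update: the velocity accumulates a
    decaying average of past gradients, and the weights move along it."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = 0.5 * ||w||^2, whose gradient is simply w itself.
w = np.array([5.0, -3.0])
velocity = np.zeros_like(w)
base_lr = 0.1

for step in range(100):
    lr = base_lr * (0.99 ** step)  # exponential learning rate decay
    grad = w                       # gradient of 0.5 * ||w||^2 at w
    w, velocity = sgd_momentum_step(w, grad, velocity, lr)

print(w)  # both components end up close to the minimum at the origin
```

The same update rule applies unchanged when `grad` comes from backpropagation through a real network; momentum smooths noisy minibatch gradients, while the decaying learning rate lets the final iterates settle near a minimum.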