February 2018
Intermediate to advanced
378 pages
10h 14m
English
Dropout is a common regularization technique for deep neural networks. The idea is to turn off randomly chosen neurons in a layer with some predefined probability at each training step. Neurons that are turned off receive no updates during that step, but are restored on the next step with their original weights. This technique reduces overfitting because no single neuron is trained on all of the data, which discourages neurons from co-adapting.
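As a minimal sketch of the idea, here is a forward pass using inverted dropout (the common modern variant, which rescales the surviving activations so that no adjustment is needed at inference time). NumPy is assumed; the function name and parameters are illustrative, not from the text.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p.

    Dividing the survivors by (1 - p) keeps the expected value of
    each activation unchanged, so the layer behaves consistently
    between training and inference.
    """
    if not training or p == 0.0:
        return x  # at inference time, all neurons stay active
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# During training, roughly a fraction p of activations are zeroed;
# with training=False the input passes through unchanged.
activations = np.ones((4, 8))
out = dropout(activations, p=0.5, rng=np.random.default_rng(0))
```

Note that only the layer's outputs are masked on a given step; the weights themselves are untouched, which is why a dropped neuron reappears intact on the next step.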