Dropout is a form of regularization that aims to prevent a model from overfitting. Overfitting occurs when a model memorizes parts of the training dataset but is less accurate on unseen test data. When you build a model, you can check whether overfitting is a problem by comparing accuracy on the training set with accuracy on the test set: if performance is much better on the training set, the model is overfitting. Dropout refers to temporarily removing nodes at random from the network during training. It is usually applied only to hidden layers, not input layers. Here is an example of dropout applied to a neural network:
For each forward ...
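As a concrete sketch (not taken from the original text), the following Keras example applies dropout after each hidden layer of a small feed-forward classifier. The layer sizes, the dropout rate of 0.5, and the 784-dimensional input are illustrative assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Dropout is inserted after each hidden layer; the input layer gets no dropout.
model = keras.Sequential([
    layers.Input(shape=(784,)),              # input layer: no dropout here
    layers.Dense(128, activation="relu"),    # hidden layer 1
    layers.Dropout(0.5),                     # randomly drop 50% of these units each training step
    layers.Dense(64, activation="relu"),     # hidden layer 2
    layers.Dropout(0.5),                     # dropout is active during training only, not at inference
    layers.Dense(10, activation="softmax"),  # output layer
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Comparing the training accuracy reported by `model.fit(...)` with the test accuracy from `model.evaluate(...)` is one way to check whether the train/test gap described above has narrowed after adding dropout.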