April 2017
Intermediate to advanced
318 pages
7h 40m
English
Regularization is a way to prevent overfitting. We have already seen examples of its usage in Chapter 1, Neural Networks Foundations. Multiple layer types accept parameters for regularization. The following is the list of regularization parameters commonly used for dense and convolutional modules:
In addition, it is possible to use Dropout for regularization, and that is frequently a very effective choice:
keras.layers.core.Dropout(rate, noise_shape=None, seed=None)
Where:
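To make the effect of the rate parameter concrete, here is a minimal NumPy sketch of the (inverted) dropout mechanism that the layer applies at training time. The dropout function below is an illustrative stand-in written for this example, not part of the Keras API; the input array and rate value are likewise arbitrary assumptions:

```python
import numpy as np

def dropout(x, rate, seed=None):
    # Illustrative inverted dropout (not the Keras implementation):
    # zero a fraction `rate` of units, then scale the survivors by
    # 1 / (1 - rate) so the expected activation is unchanged.
    rng = np.random.RandomState(seed)
    mask = rng.binomial(1, 1.0 - rate, size=x.shape)
    return x * mask / (1.0 - rate)

x = np.ones((4, 5))
y = dropout(x, rate=0.4, seed=42)
# Each entry of y is either 0.0 (dropped) or 1 / 0.6 (kept and rescaled).
```

Scaling at training time (rather than at inference) is why the layer can be a no-op when the model is evaluated: the surviving activations already have the correct expected magnitude.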