
Deep Learning with Keras

by Antonio Gulli, Sujit Pal
April 2017
Intermediate to advanced
318 pages
7h 40m
English
Packt Publishing

Regularization

Regularization is a way to prevent overfitting. We have already seen examples of its use in Chapter 1, Neural Networks Foundations. Many layers accept parameters for regularization. The following regularization parameters are commonly used for dense and convolutional modules (a usage sketch follows the list):

  • kernel_regularizer: Regularizer function applied to the weight matrix
  • bias_regularizer: Regularizer function applied to the bias vector
  • activity_regularizer: Regularizer function applied to the output of the layer (its activation)
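
Here is a minimal sketch of how these parameters are passed to a layer; the layer sizes and the l2/l1 penalty values are illustrative assumptions, not values taken from the book:

from keras import regularizers
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# L2 penalty on the weight matrix and on the bias vector,
# L1 penalty on the layer's output (its activation)
model.add(Dense(64, input_dim=20, activation='relu',
                kernel_regularizer=regularizers.l2(0.01),
                bias_regularizer=regularizers.l2(0.01),
                activity_regularizer=regularizers.l1(0.01)))
model.add(Dense(10, activation='softmax'))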

In addition, it is possible to use Dropout for regularization, and that is frequently a very effective choice:

keras.layers.core.Dropout(rate, noise_shape=None, seed=None)

Where:

  • rate: It is a float between 0 and 1 that specifies the fraction of the input units to drop during training
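
A usage sketch for Dropout, typically inserted between two layers, is shown below; the 0.5 rate and the layer sizes are illustrative choices, not values taken from the book:

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(128, input_dim=784, activation='relu'))
# Randomly drop half of the previous layer's outputs at each training update;
# dropout is automatically disabled at prediction time
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))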


ISBN: 9781787128422