Regularization
Recall from Chapter 2, Machine Learning Basics, that overfitting and underfitting can happen when a machine learning model learns its training dataset too well, or when it doesn't learn it well enough. Artificial neural networks are not immune to this problem! Overfitting often occurs in neural networks because the number of parameters they have is too large for the training data. In other words, the model is too complex for the amount of data it is being trained on.
One way to prevent overfitting in our networks is through a technique called regularization. Regularization works by shrinking the parameters of a model to create a less complex model, thereby reducing overfitting. Let's say we have a loss ...
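As a rough illustration of the idea, here is a minimal sketch of L2 regularization (weight decay) applied to a mean-squared-error loss. The function names and the penalty strength `lam` are illustrative assumptions, not from the text; the point is only that the added penalty grows with the size of the weights, so minimizing the total loss pushes the parameters toward zero.

```python
import numpy as np

def mse_loss(w, X, y):
    """Plain mean-squared-error loss for a linear model (illustrative)."""
    preds = X @ w
    return np.mean((preds - y) ** 2)

def l2_regularized_loss(w, X, y, lam=0.1):
    """MSE loss plus an L2 penalty lam * ||w||^2.

    The penalty is non-negative and grows with the magnitude of the
    weights, so minimizing the total loss shrinks the parameters,
    yielding a less complex model.
    """
    return mse_loss(w, X, y) + lam * np.sum(w ** 2)

# Small synthetic example (made-up data, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

w = rng.normal(size=3)
plain = mse_loss(w, X, y)
regularized = l2_regularized_loss(w, X, y, lam=0.1)
# The regularized loss is never smaller than the plain loss,
# since the L2 penalty term is always >= 0.
```

In practice the same penalty term is simply added to whatever loss the network is trained with, and the gradient of the penalty (`2 * lam * w`) nudges every weight toward zero at each update step.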