July 2017
Beginner to intermediate
715 pages
17h 3m
English
When learning from data, there is always the danger of overfitting. Overfitting occurs when a model starts learning the noise in the data instead of detecting useful patterns. It is always important to check whether a model overfits; otherwise it will not be useful when applied to unseen data.
The typical and most practical way of checking whether a model overfits is to emulate unseen data: set aside a part of the available labeled data and do not use it for training.
This technique is called a hold-out split, because we hold out a part of the data and use it only for evaluation.
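The following is a minimal sketch of a hold-out split, assuming scikit-learn is available; the dataset, model, and 25% hold-out fraction are illustrative choices, not taken from the text.

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)

# Hold out 25% of the labeled data; shuffling before splitting avoids a
# hold-out set biased by the original row order.
X_train, X_holdout, y_train, y_holdout = train_test_split(
    X, y, test_size=0.25, shuffle=True, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Evaluate only on the held-out data to estimate performance on unseen data.
print("Hold-out accuracy:", model.score(X_holdout, y_holdout))

The trained model never sees the held-out examples, so its score on them is a more honest estimate of how it will behave on genuinely new data.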

We also shuffle the original dataset before ...