Understanding overfitting

The bias-variance trade-off goes hand in hand with an important problem in machine learning called overfitting. If your model is too simple, it will make large errors on both the training data and new data. If it is too complex, it will memorize the training data rather than learn the pattern behind it, acting more like a database than a model. Suppose our housing dataset contains a few lucky deals where houses sold at a low price because of circumstances not captured in the data. An overfit model will reproduce those examples too closely and predict incorrect prices on unseen data.
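
To make this concrete, here is a minimal sketch of overfitting in action. It is not from the book: the synthetic dataset, the single size feature, and the polynomial degrees are all illustrative assumptions, and it presumes NumPy and scikit-learn are installed. A deliberately over-flexible model fits the training prices almost perfectly, lucky deals included, and then does worse than a simple one on held-out data.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Synthetic "housing" data: price rises linearly with size, plus noise
# standing in for circumstances not captured in the data (lucky deals).
size = rng.uniform(0.5, 2.0, 30).reshape(-1, 1)  # size, in hundreds of m^2
price = 300_000 * size.ravel() + rng.normal(0, 40_000, 30)

X_train, X_test, y_train, y_test = train_test_split(
    size, price, test_size=0.3, random_state=0
)

# Degree 1 is a simple model; degree 15 is flexible enough to memorize
# the 21 training points rather than the underlying trend.
for degree in (1, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_mse:12,.0f}  "
          f"test MSE={test_mse:12,.0f}")

# The degree-15 model typically shows a much lower training error but a
# much higher test error: it has memorized the noise, not the pattern.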

Now, having understood the error decomposition, can we use it as a stepping stone to design a model-testing pipeline?

We need to determine how to measure model error ...
