Bias-variance trade-off
The reducible part of a model's expected error can be decomposed into two components: bias and variance. The difference between them is commonly explained with a shooting metaphor: high bias means the shots cluster consistently away from the bull's-eye, while high variance means they scatter widely around it. If you train a high-variance model on 10 different datasets, the results will be very different; if you train a high-bias model on 10 different datasets, you will get very similar results. In other words, high-bias models tend to underfit and high-variance models tend to overfit. Usually, the more parameters a model has, the more prone it is to overfitting, but there are also differences between model classes: parametric models such as linear and logistic regression tend to be biased, while nonparametric models such as decision trees tend to have high variance.
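To make the "10 different datasets" thought experiment concrete, here is a minimal sketch, assuming NumPy and scikit-learn are available. The sine target, the noise level, and helper names such as predictions_over_datasets are illustrative choices, not from the text. It trains a high-bias linear model and a high-variance unpruned decision tree on 10 noisy samples of the same underlying function and compares how much their predictions vary across datasets and how far their average prediction sits from the truth:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

def true_fn(x):
    # The underlying function every dataset is sampled from (an assumption).
    return np.sin(2 * np.pi * x)

x_test = np.linspace(0, 1, 200).reshape(-1, 1)

def predictions_over_datasets(make_model, n_datasets=10, n_points=30, noise=0.2):
    # Train a fresh model on each of n_datasets noisy samples;
    # return an (n_datasets, len(x_test)) array of predictions.
    preds = []
    for _ in range(n_datasets):
        x = rng.uniform(0, 1, size=(n_points, 1))
        y = true_fn(x).ravel() + rng.normal(0, noise, size=n_points)
        model = make_model()
        model.fit(x, y)
        preds.append(model.predict(x_test))
    return np.array(preds)

# High-bias model: a straight line cannot capture the sine curve (underfits).
linear_preds = predictions_over_datasets(LinearRegression)
# High-variance model: an unpruned tree fits the noise in each sample (overfits).
tree_preds = predictions_over_datasets(lambda: DecisionTreeRegressor(max_depth=None))

truth = true_fn(x_test).ravel()
# Variance: how much predictions differ across the 10 datasets, averaged over x.
print("linear variance:", linear_preds.var(axis=0).mean())
print("tree variance:  ", tree_preds.var(axis=0).mean())
# Squared bias: gap between the average prediction and the true function.
print("linear bias^2:  ", ((linear_preds.mean(axis=0) - truth) ** 2).mean())
print("tree bias^2:    ", ((tree_preds.mean(axis=0) - truth) ** 2).mean())

Run as written, the linear model shows low variance but high squared bias, and the deep tree the opposite, which is the trade-off the shooting metaphor describes.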