Striking a balance

As Figure 9.10 depicts, as a model becomes more complex and flexible (as it starts to include more and more predictors) the bias of the model continues to decrease. Along the complexity axis, as the model fits the training data better and better, the cross-validation error decreases as well. At a certain point, the model becomes overly complex and begins to fit idiosyncratic noise in the training data set: it overfits!

The cross-validation error begins to climb again, even as the bias of the model approaches its theoretical minimum!

The far left of the plot depicts models with too much bias but little variance. The right side of the plot depicts models with very low bias but very high variance, which are therefore useless for making predictions on new data.
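The U-shaped cross-validation curve can be reproduced with a small simulation. This is an illustrative sketch, not code from the book: it fits polynomials of increasing degree to noisy data whose true relationship is linear, and tracks the leave-one-out cross-validation error. Degree stands in for model complexity, and the least-squares fit is hand-rolled so the example needs only the standard library.

```python
import random

def fit_poly(xs, ys, degree):
    """Least-squares polynomial fit via normal equations (A^T A)c = A^T y,
    solved with Gaussian elimination and partial pivoting."""
    n = degree + 1
    ata = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum((x ** i) * y for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        # Pivot on the largest remaining entry in this column for stability.
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coefs = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        coefs[r] = (aty[r] - sum(ata[r][c] * coefs[c]
                                 for c in range(r + 1, n))) / ata[r][r]
    return coefs

def predict(coefs, x):
    return sum(c * x ** i for i, c in enumerate(coefs))

def loocv_mse(xs, ys, degree):
    """Leave-one-out cross-validation: hold out each point in turn,
    fit on the rest, and average the squared prediction errors."""
    errs = []
    for i in range(len(xs)):
        coefs = fit_poly(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:], degree)
        errs.append((ys[i] - predict(coefs, xs[i])) ** 2)
    return sum(errs) / len(errs)

random.seed(9)
xs = [i / 10 - 1 for i in range(21)]             # 21 points on [-1, 1]
ys = [2 * x + random.gauss(0, 0.5) for x in xs]  # linear truth + noise

for degree in (0, 1, 3, 8):
    print(degree, round(loocv_mse(xs, ys, degree), 3))
```

The degree-0 model (a constant) is too biased and the degree-8 model fits the noise, so their cross-validation errors should both exceed that of the degree-1 model, tracing out the U-shape described above.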
