Another approach to regularization involves creating multiple models (ensembles) and combining them, for example by model averaging or some other rule for combining the individual models' outputs. Ensemble techniques have a rich history in machine learning, including bagging, boosting, and random forests. The general idea is that if you build different models from the training data, each model makes different errors in its predictions. Where one model predicts too high, another may predict too low; when the predictions are averaged, some of those errors cancel out, yielding a more accurate prediction than any single model would have produced on its own.
The key to ensemble methods is that the individual models make errors that are at least partly independent of one another; if every model made the same mistakes, averaging their predictions would gain nothing.
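The following is a minimal sketch of model averaging, not code from this book: it assumes scikit-learn and NumPy, uses decision trees fit on bootstrap samples (bagging-style) as the individual models, and simply averages their predictions on a synthetic regression task.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

# Synthetic regression data, split into train and test sets.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
models = []
for _ in range(10):
    # Bootstrap sample: draw training rows with replacement so each
    # model sees a slightly different view of the data.
    idx = rng.integers(0, len(X_train), size=len(X_train))
    model = DecisionTreeRegressor(max_depth=6).fit(X_train[idx], y_train[idx])
    models.append(model)

# Ensemble prediction = average of the individual models' predictions;
# errors that point in opposite directions partially cancel.
ensemble_pred = np.mean([m.predict(X_test) for m in models], axis=0)

print("single model MSE:", mean_squared_error(y_test, models[0].predict(X_test)))
print("ensemble MSE:    ", mean_squared_error(y_test, ensemble_pred))

On a run like this, the averaged prediction typically has a lower test error than any one of the bootstrapped trees, illustrating the error-cancellation argument above.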