Until now, we have seen single learning algorithms of growing complexity. Ensembles are an effective alternative: they tend to achieve better predictive accuracy by combining or chaining the results of models trained on different data samples, with different algorithm settings, or of different algorithm types.
Ensembles divide into two branches, according to the method used to combine predictions:
- Averaging algorithms: These predict by averaging the results of various estimators built in parallel. Variations in how the estimators are built provide a further division into four families: pasting, bagging, random subspaces, and random patches.
- Boosting algorithms: These predict by using a weighted combination of estimators built sequentially, each new estimator focusing on the errors left by the previous ones.
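As a minimal sketch of the two branches, the following compares a bagging ensemble (averaging) with a gradient boosting ensemble (boosting) in scikit-learn; the synthetic dataset and hyperparameters here are illustrative, not a recommendation:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic classification problem
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Averaging branch: parallel trees on bootstrap samples, predictions averaged.
# The four averaging families map onto BaggingClassifier parameters:
#   pasting          -> bootstrap=False (samples drawn without replacement)
#   bagging          -> bootstrap=True (bootstrap samples, shown here)
#   random subspaces -> max_features < 1.0 (random feature subsets)
#   random patches   -> subsample both samples and features
bagging = BaggingClassifier(DecisionTreeClassifier(),
                            n_estimators=100, bootstrap=True, random_state=0)
bagging.fit(X_train, y_train)

# Boosting branch: trees built sequentially, each correcting its predecessors,
# combined with weights into the final prediction
boosting = GradientBoostingClassifier(n_estimators=100, random_state=0)
boosting.fit(X_train, y_train)

print(f"bagging accuracy:  {bagging.score(X_test, y_test):.3f}")
print(f"boosting accuracy: {boosting.score(X_test, y_test):.3f}")
```

Both ensembles wrap the same kind of weak learner (a decision tree); what differs is whether the estimators are built independently and averaged, or built one after another with reweighting.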
Before delving into some examples for both classification ...