July 2017
382 pages
All the ensemble methods we have seen so far share a common design philosophy: fit multiple individual classifiers to the data and combine their predictions into a final prediction using some simple decision rule, such as averaging or majority voting.
Stacking ensembles, on the other hand, are built hierarchically. Individual learners are organized into multiple layers, where the predictions produced by one layer of learners serve as training data for a model in the next layer. In this way, it is possible to successfully blend hundreds of different models.
Unfortunately, we won't have the time to discuss stacking ensembles in detail.
However, these models can be very powerful, as seen, for example, ...
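As a minimal sketch of the idea, here is a two-layer stacking ensemble built with scikit-learn's `StackingClassifier`. The dataset, base learners, and meta-learner below are illustrative choices, not taken from the text; the key mechanism is that the second-layer model is trained on cross-validated predictions from the first layer:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy classification data for illustration only
X, y = make_classification(n_samples=500, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# First layer: heterogeneous base learners
base_learners = [
    ("tree", DecisionTreeClassifier(max_depth=3, random_state=42)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# Second layer: a meta-learner fit on the base learners'
# out-of-fold predictions (cv=5 controls the internal splitting)
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_train, y_train)
print("test accuracy:", stack.score(X_test, y_test))
```

Because the meta-learner sees only out-of-fold predictions, it learns how to weight the base models without simply memorizing their training-set fit.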