Boosting

Boosting offers an alternative take on the problem of how to combine models to achieve greater performance. In particular, it is especially suited to weak learners. Weak learners are models whose accuracy is only slightly better than that of a model that guesses randomly. One way to create a weak learner is to take a model whose complexity is configurable and constrain that complexity severely.

For example, we can train a multilayer perceptron network with a very small number of neurons in its hidden layer. Similarly, we can train a decision tree but allow it to comprise only a single decision node, resulting in a single split of the input data. This special type of decision tree is known as a stump.
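As a minimal sketch of these two ideas in R, the following assumes the rpart and nnet packages are available and uses the built-in iris data purely for illustration; restricting rpart to a maximum depth of one produces a stump, and giving nnet a tiny hidden layer produces a weak perceptron.

    # Assumes the rpart and nnet packages are installed
    library(rpart)
    library(nnet)

    data(iris)

    # Decision stump: a tree limited to a single split (maxdepth = 1)
    stump <- rpart(Species ~ ., data = iris,
                   control = rpart.control(maxdepth = 1))

    # Weak multilayer perceptron: only two neurons in the hidden layer
    small_mlp <- nnet(Species ~ ., data = iris, size = 2, trace = FALSE)

Both models will fit the training data far from perfectly, which is exactly what boosting requires of its base learners.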

When we looked at bagging, the key idea was to take a set of random ...
