Hands-On Ensemble Learning with R

by Prabhanjan Narayanachar Tattar
July 2018
Beginner to intermediate content level
376 pages
9h 1m
English
Packt Publishing
Content preview from Hands-On Ensemble Learning with R

Bagging and Random Forests

Chapter 3, Bagging, and Chapter 4, Random Forests, demonstrate how to improve the stability and accuracy of a basic decision tree. In this section, we will again use decision trees as the base learners and build an ensemble of trees in the same way that we did in those two chapters.
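To keep the starting point concrete, the following sketch fits a single regression tree as the base learner. This is not the book's own code: the rpart package and the Boston housing data from MASS are assumed stand-ins for whichever tree implementation and dataset the chapter actually uses.

```r
# Minimal sketch (assumed stand-ins, not the book's code): one CART regression
# tree as the base learner that the ensemble methods will later aggregate.
library(rpart)   # recursive partitioning trees
library(MASS)    # provides the Boston housing data, used here as a placeholder

data(Boston)
set.seed(123)

# Fit a single regression tree predicting median home value (medv)
base_tree <- rpart(medv ~ ., data = Boston)

# Predictions from the lone tree; bagging and random forests average many such trees
head(predict(base_tree, newdata = Boston))
```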

For both the bagging and random forest algorithms, the split function is the primary difference between classification and regression trees. Thus, unsurprisingly, for the regression problem we can continue to use the same functions and packages that we used for the classification problem. We will first use the bagging function from the ipred package to set up the bagging algorithm for the housing ...
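A hedged sketch of what that setup might look like follows. The bagging function with its nbagg and coob arguments is the ipred package's interface; the Boston data from MASS is only an assumed substitute for the housing dataset the excerpt refers to.

```r
# Sketch of bagging for a regression problem with ipred::bagging.
# Boston (from MASS) stands in for the book's housing data.
library(ipred)
library(MASS)

data(Boston)
set.seed(123)

# nbagg bootstrap samples, one regression tree per sample;
# coob = TRUE requests the out-of-bag estimate of the prediction error
housing_bag <- bagging(medv ~ ., data = Boston, nbagg = 50, coob = TRUE)

housing_bag$err                               # out-of-bag root mean squared error
head(predict(housing_bag, newdata = Boston))  # averaged ensemble predictions
```

Averaging the predictions of many bootstrapped trees is what reduces the variance of the single tree fitted above, which is the point the chapter develops.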

Publisher Resources

ISBN: 9781788624145