Chapter 12. Regression with kNN, random forest, and XGBoost
This chapter covers
- Using the k-nearest neighbors algorithm for regression
- Using tree-based algorithms for regression
- Comparing k-nearest neighbors, random forest, and XGBoost models
You’re going to find this chapter a breeze. This is because you’ve done everything in it before (sort of). In chapter 3, I introduced you to the k-nearest neighbors (kNN) algorithm as a tool for classification. In chapter 7, I introduced you to decision trees and then expanded on this in chapter 8 to cover random forest and XGBoost for classification. Conveniently, these algorithms can also be used to predict continuous variables. So in this chapter, I’ll help you extend these skills to solve regression problems.
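To see why kNN carries over to regression so directly: instead of taking a majority vote over the classes of the k nearest neighbors, you average their continuous target values. The book's examples use R with mlr, but here is a minimal, dependency-free sketch of that idea in Python (the function name `knn_regress` and the toy data are illustrative, not from the book):

```python
def knn_regress(train_X, train_y, query, k=3):
    """Predict a continuous value for `query` by averaging the
    target values of its k nearest training cases (Euclidean distance)."""
    # Rank training cases by squared distance to the query point
    order = sorted(
        range(len(train_X)),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], query)),
    )
    nearest = order[:k]
    # Regression prediction: the mean of the neighbors' target values
    return sum(train_y[i] for i in nearest) / k

# Toy data: one predictor, one continuous outcome
train_X = [[0.0], [1.0], [2.0], [10.0]]
train_y = [0.0, 1.0, 2.0, 10.0]

print(knn_regress(train_X, train_y, query=[1.5], k=2))  # averages y of [1.0] and [2.0]
```

With `k=2`, the two nearest neighbors of `1.5` are the cases at `1.0` and `2.0`, so the prediction is their mean, `1.5`. Tree-based regressors work analogously: leaves predict the mean outcome of the training cases that fall into them rather than a majority class.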