Summary
At a high level, in this chapter you learned about four of the most popular classifiers out there: k-Nearest Neighbors, logistic regression, decision trees, and random forests. Not only did you learn the basics and mechanics of these four algorithms, but you saw how easy they are to use in R. Along the way, you learned about confusion matrices, hyperparameter tuning, and maybe even a few new R incantations.
We also visited some more general ideas: you expanded your understanding of the bias-variance trade-off, saw how the GLM can perform great feats, and became acquainted with ensemble learning and bootstrap aggregation. It's also my hope that you've developed some intuition as to which classifiers ...