Summary

In this chapter, we introduced the essentials of machine learning. We started with some simple, but still quite effective, models (linear and logistic regression, Naive Bayes, and K-Nearest Neighbors). Then, we moved on to a more advanced one, the Support Vector Machine (SVM). We explained how to combine weak classifiers into stronger ones (ensembles, Random Forests, Gradient Tree Boosting) and touched on three powerful gradient-boosting implementations: XGBoost, LightGBM, and CatBoost. Finally, we had a peek at the algorithms used for big data, clustering, and NLP.
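As a quick recap of the classifiers covered above, the following sketch (assuming scikit-learn is installed; the variable names and the choice of the Iris dataset are ours, for illustration only) fits several of them on the same data and compares their test accuracy:

```python
# A minimal comparison of classifiers from this chapter on the Iris dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Load a small toy dataset and hold out 30% of it for testing
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

classifiers = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": GaussianNB(),
    "K-Nearest Neighbors": KNeighborsClassifier(),
    "SVM": SVC(),
    "Random Forest": RandomForestClassifier(random_state=42),
    "Gradient Tree Boosting": GradientBoostingClassifier(random_state=42),
}

# Fit each model and record its accuracy on the held-out test set
scores = {}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    scores[name] = clf.score(X_test, y_test)
    print(f"{name}: {scores[name]:.3f}")
```

On a well-separated dataset like Iris, all of these models reach high accuracy; their differences become apparent on larger, noisier data, which is where the ensemble and boosting methods tend to shine.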

In the next chapter, we are going to introduce you to the basics of visualization with Matplotlib, show you how to perform EDA with pandas and achieve beautiful visualizations with Seaborn, and explain how to set up a web server ...
