9. Interpreting a Machine Learning Model
Overview
This chapter will show you how to interpret a machine learning model's results and gain deeper insight into the patterns it found. By the end of the chapter, you will be able to analyze the weights of linear models and the variable importance of a RandomForest model. You will be able to implement permutation importance to analyze the contribution of each feature. You will use partial dependence plots to analyze single variables and make use of the lime package for local interpretation.
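As a preview of two of the techniques this chapter covers, the short sketch below fits a RandomForest with scikit-learn, reads its built-in variable importance, and then computes permutation importance with `sklearn.inspection.permutation_importance`. The dataset and hyperparameters here are illustrative choices, not taken from the book's exercises.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Illustrative dataset; the book may use different data.
X, y = load_iris(return_X_y=True)

model = RandomForestClassifier(n_estimators=50, random_state=42)
model.fit(X, y)

# Built-in (impurity-based) importance: one score per feature,
# summing to 1.
print(model.feature_importances_)

# Permutation importance: shuffle one feature at a time and
# measure how much the model's score drops on average.
result = permutation_importance(model, X, y, n_repeats=5,
                                random_state=42)
print(result.importances_mean)
```

Permutation importance is often preferred over the impurity-based scores because it is computed against the model's actual predictive score and is less biased toward high-cardinality features.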
Introduction
In the previous chapter, you saw how to find the optimal hyperparameters of some of the most popular machine learning algorithms in order to get better predictive performance (that is, more accurate predictions). ...