To get a high-level overview of feature importance across many samples, SHAP values can be plotted in two ways: as a simple average across all samples, which resembles the global feature-importance measures computed previously (shown in the left-hand panel of the following screenshot), or as a scatter plot that displays the impact of every feature for every sample (shown in the right-hand panel of the following screenshot). Both are straightforward to produce from a trained model of a compatible library and matching input data, as shown in the following code:
import shap

# load JS visualization code to notebook
shap.initjs()

# explain the model's predictions using SHAP values
explainer = shap.TreeExplainer(model)
...
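Since the snippet above is truncated, the following is a minimal sketch of how both summary plots can be produced with the shap library; the names model and X_test are assumptions standing in for a trained tree-based model (for example, a gradient-boosting model) and its matching feature matrix:

import shap

# assumption: `model` is a trained tree-based model and
# `X_test` is the matching feature matrix
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# left-hand panel: average (mean absolute) SHAP value
# per feature, rendered as a bar chart
shap.summary_plot(shap_values, X_test, plot_type='bar')

# right-hand panel: per-sample SHAP values for every
# feature, rendered as a scatter plot
shap.summary_plot(shap_values, X_test)

Passing plot_type='bar' collapses the per-sample values into a single average per feature, while the default call plots one point per sample and feature, colored by the feature's value.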