February 2018
The simplest way to simplify a decision tree is to limit its depth. How deep is it now? You can count 20 splits, or 21 layers, in Figure 2.5. At the same time, we have only three features, or six if we take into account the one-hot encoded categorical color feature. Let's limit the maximum depth of the tree aggressively, to something comparable with the number of features. The tree_model object has a max_depth attribute, so we set it to a value less than the number of features:
In []: tree_model.max_depth = 4
After this change, we can retrain the model and re-evaluate its accuracy:
In []: tree_model = tree_model.fit(X_train, y_train)
       tree_model.score(X_train, y_train)
Out[]: 0.90571428571428569
...
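The effect of capping the depth can be sketched on a synthetic dataset (not the book's data; the variable names and random seeds below are illustrative assumptions). An unconstrained tree memorizes the training set, while a tree limited to max_depth=4 gives up some training accuracy in exchange for a far simpler model:

```python
# A minimal sketch, assuming scikit-learn is installed.
# The dataset is synthetic; six features mirror the book's feature count.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained tree keeps splitting until the training set is memorized.
deep_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Capping the depth forces the tree to stop after at most 4 levels of splits.
shallow_tree = DecisionTreeClassifier(max_depth=4, random_state=0)
shallow_tree.fit(X_train, y_train)

print("deep train accuracy:   ", deep_tree.score(X_train, y_train))
print("shallow train accuracy:", shallow_tree.score(X_train, y_train))
print("shallow actual depth:  ", shallow_tree.get_depth())
```

The shallow tree's training accuracy drops below the deep tree's, which is expected: the lost percentage points were mostly fitting noise, and the pruned model tends to hold up better on unseen data.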