The scikit-learn library provides the DecisionTreeClassifier class, which can train a binary decision tree using either the Gini or the cross-entropy impurity measure. In our example, let's consider a dataset with 3 features and 3 classes:
from sklearn.datasets import make_classification

nb_samples = 500

# Create a synthetic dataset with 3 informative features and 3 classes
X, Y = make_classification(n_samples=nb_samples, n_features=3, n_informative=3,
                           n_redundant=0, n_classes=3, n_clusters_per_class=1)
First, let's consider a classification with the default Gini impurity:
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# The default impurity criterion is 'gini'
dt = DecisionTreeClassifier()
print(cross_val_score(dt, X, Y, scoring='accuracy', cv=10).mean())
0.970
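Since the class also supports the cross-entropy impurity mentioned above, a minimal sketch of switching the criterion (assuming the same X and Y defined earlier; not part of the original example) could look like this:

# Sketch: the same cross-validated evaluation, but with the cross-entropy
# impurity selected through criterion='entropy'
dt_entropy = DecisionTreeClassifier(criterion='entropy')
print(cross_val_score(dt_entropy, X, Y, scoring='accuracy', cv=10).mean())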
A very interesting ...