May 2017
After linear regression, the next regression algorithm we will learn is decision tree regression, also called a regression tree.
In classification, a decision tree is constructed by recursive binary splitting, growing each node into left and right children. At each partition, it greedily searches for the best combination of a feature and a value of that feature to use as the optimal splitting point. The quality of a split is measured by the weighted purity of the labels of the two resulting children, via a metric such as Gini impurity or information gain. In regression, the tree construction process is almost identical to the classification one, with only two differences due to the fact that the target becomes ...
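As a rough sketch of the split-quality measure described above, the weighted Gini impurity of a candidate split could be computed as follows (a minimal illustration; the function names are my own, not from the book):

```python
import numpy as np

def gini_impurity(labels):
    """Gini impurity = 1 - sum of squared class probabilities."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return 1.0 - np.sum(probs ** 2)

def weighted_impurity(groups):
    """Quality of a split: children's impurities weighted by their sizes."""
    total = sum(len(group) for group in groups)
    return sum(len(group) / total * gini_impurity(group)
               for group in groups)

# A split that separates the classes perfectly scores 0,
# while a split that leaves both children mixed scores higher:
pure_split = weighted_impurity([[0, 0], [1, 1]])    # 0.0
mixed_split = weighted_impurity([[0, 1], [0, 1]])   # 0.5
```

During tree construction, the algorithm would evaluate this score for every candidate feature-value pair and keep the split with the lowest weighted impurity (equivalently, the highest purity gain).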