Construction of random decision tree number 1

We are given six samples as input data. From these, we choose six samples at random with replacement (bootstrap sampling) for the construction of this random decision tree:

[['Good', 'Warm', 'Yes'], ['None', 'Warm', 'No'], ['Good', 'Cold', 'No'], ['None', 'Cold', 'No'], ['None', 'Warm', 'No'], ['Small', 'Warm', 'No']]
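A random subset like the one above can be produced by bootstrap sampling: drawing as many samples as the original set contains, uniformly at random with replacement. A minimal sketch, assuming the chapter's swim-preference data as the sample pool (the exact pool and helper name are illustrative):

```python
import random

# Assumed original sample pool: each row is
# [swimming suit quality, water temperature, swim preference].
samples = [
    ['None', 'Cold', 'No'],
    ['Small', 'Cold', 'No'],
    ['Small', 'Warm', 'No'],
    ['None', 'Warm', 'No'],
    ['Good', 'Cold', 'No'],
    ['Good', 'Warm', 'Yes'],
]

def bootstrap(data, rng=random):
    """Draw len(data) samples at random with replacement."""
    return [rng.choice(data) for _ in range(len(data))]

subset = bootstrap(samples)
print(subset)  # six rows; duplicates are possible, some rows may be absent
```

Because sampling is done with replacement, each tree in the forest sees a slightly different version of the data, which is what makes the trees diverse.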

The remainder of the construction of random decision tree number 1 is similar to the construction of the previous random decision tree, number 0. The only difference is that this tree is built from a different randomly generated subset of the initial data, shown above.

We begin construction with the root node to create the first node of the tree. We would like to add children to the [root] ...
