Dhiraj, a data scientist and machine learning evangelist, continues his machine learning series with a deep dive into the decision tree algorithm. Learn all about this powerful algorithm in a video series covering eight topics:
- Introducing Decision Trees. This first video in the decision tree series introduces this powerful yet simple algorithm. A decision tree works much like a series of conditional control statements (IF-ELSE). Understand key decision tree concepts including root node, decision node, leaf node, parent node, splitting, and pruning. There are two types of decision trees: decision trees with categorical variables and decision trees with continuous variables. There are also three popular techniques for generating a decision tree: ID3 (Iterative Dichotomiser 3), C4.5, and CART (Classification And Regression Trees).
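The IF-ELSE view of a decision tree can be sketched by hand. In the sketch below, the function name, features, and thresholds are invented purely to illustrate the structure:

```python
# A decision tree is conceptually a nest of IF-ELSE statements.
# Hypothetical thresholds, chosen only for illustration:
def classify_flower(petal_length, petal_width):
    if petal_length < 2.5:          # root node test
        return "setosa"             # leaf node
    else:                           # decision node
        if petal_width < 1.8:
            return "versicolor"     # leaf node
        else:
            return "virginica"      # leaf node

print(classify_flower(1.4, 0.2))
```

Training a decision tree amounts to learning these tests and thresholds from data rather than writing them by hand.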
- Decision Tree Advantages and Disadvantages. This second video in the decision tree series covers both the advantages and disadvantages of using decision trees. Advantages include relatively little effort for data preparation, no requirement to normalize the data, and tolerance of missing values when building the tree. Disadvantages include difficulty handling volatile data, potentially complex calculations, and the substantial time it can take to train the model.
- Decision Tree Regression. This third video in the decision tree series explains how to perform decision tree regression. Decision tree regression observes the features of an object and trains a model to produce meaningful continuous output. An example of a discrete target is predicting the weather on a particular day; an example of a continuous target is predicting the profit generated from sales. Understand the difference between decision tree regression and linear regression.
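A minimal sketch of decision tree regression, assuming scikit-learn; the tiny sales-versus-profit dataset is invented for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical data: sales volume (feature) vs. profit (continuous target)
X = np.array([[5], [10], [15], [20], [25], [30]])
y = np.array([1.2, 2.1, 2.9, 4.2, 5.1, 5.8])

reg = DecisionTreeRegressor(max_depth=2, random_state=0)
reg.fit(X, y)

# Predict the profit for an unseen sales volume
prediction = reg.predict([[12]])
print(prediction)
```

Unlike linear regression, which fits a single straight line, the tree predicts the mean of the training targets in each leaf, producing a piecewise-constant output.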
- Decision Tree Classification. This fourth video in the decision tree series focuses on the decision tree classifier. The data set is split into subsets based on an attribute value test, and those subsets are split further in a process called recursive partitioning. Understand the difference between decision tree classification and linear regression.
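Recursive partitioning can be seen directly by printing a fitted tree's structure. A small sketch using scikit-learn's built-in Iris dataset:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
clf = DecisionTreeClassifier(max_depth=2, random_state=0)
clf.fit(iris.data, iris.target)

# Each indented level is one more recursive split of a subset
tree_text = export_text(clf, feature_names=iris.feature_names)
print(tree_text)
```

The printed rules show each attribute value test (e.g., a threshold on petal width) and the class assigned at each leaf.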
- Decision Tree Information Gain. This fifth video in the decision tree series explains information gain in depth. Information gain is a measure of how much information a feature in a given dataset provides about the class. Also learn all about entropy, which plays an essential role in deciding how a decision tree splits the data.
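Entropy and information gain can be computed in a few lines. A minimal sketch (the helper function names are my own, not from the videos):

```python
import numpy as np

def entropy(labels):
    # Shannon entropy (in bits) of a list of class labels
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(parent, left, right):
    # Entropy of the parent minus the weighted entropy of the children
    n = len(parent)
    child = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - child

parent = ["yes", "yes", "yes", "no", "no", "no"]
left, right = ["yes", "yes", "yes"], ["no", "no", "no"]
print(information_gain(parent, left, right))  # a perfect split gains 1 full bit
```

When building the tree, the split with the highest information gain is chosen at each node.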
- Building a Decision Tree in Python. This sixth video in the decision tree series shows you hands-on how to create a decision tree using Python.
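An end-to-end sketch of training a decision tree in Python with scikit-learn (the split ratio and random_state values are arbitrary choices, not taken from the video):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Load a built-in dataset and hold out a test set
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# criterion="entropy" selects splits by information gain
clf = DecisionTreeClassifier(criterion="entropy", random_state=42)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```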
- Decision Tree Prediction. This seventh video in the decision tree series explains how to create sample input for the model, use this sample input to have the model make a prediction, and then compare the prediction to the actual output. We will be using the scikit-learn predict() function.
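Making a prediction with a fitted model might look like this sketch; the sample measurements are taken from the Iris dataset itself, so the expected label is known:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
clf = DecisionTreeClassifier(random_state=0).fit(iris.data, iris.target)

# Sample input: sepal length, sepal width, petal length, petal width (cm)
sample = [[5.1, 3.5, 1.4, 0.2]]
pred = clf.predict(sample)
print(iris.target_names[pred[0]])
```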
- Decision Tree Evaluation. This eighth video in the decision tree series explains how to evaluate a decision tree model. Once the machine learning model has been evaluated, we can use the feedback to improve it until it produces the desired accuracy. We will use a Confusion Matrix to evaluate our decision tree model.
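Evaluation with a confusion matrix might look like this minimal sketch, assuming scikit-learn and the Iris dataset; rows are true classes and columns are predicted classes:

```python
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1)

clf = DecisionTreeClassifier(random_state=1).fit(X_train, y_train)
y_pred = clf.predict(X_test)

# Off-diagonal entries count misclassifications
cm = confusion_matrix(y_test, y_pred)
print(cm)
print(accuracy_score(y_test, y_pred))
```

The diagonal of the matrix counts correct predictions per class, which makes it easy to see which classes the model confuses.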