In Chapter 6, Predicting Online Ads Click-through with Tree-Based Algorithms, we introduced random forest as an ensemble learning method that combines multiple decision trees, each trained separately with a random subsample of the training features considered at each node of a tree. In classification, a random forest makes its final decision by a majority vote over all tree decisions. Applied to regression, a random forest regression model (also called a regression forest) takes the average of the regression results from all decision trees as its final prediction.
Here, we'll use the regression forest implementation, RandomForestRegressor, from scikit-learn's ensemble module and apply it to our Boston house price prediction example:
>>> from sklearn.ensemble import RandomForestRegressor
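To show the import in context, here is a minimal end-to-end sketch of fitting a regression forest. Since the Boston dataset has been removed from recent scikit-learn releases, a synthetic dataset generated with make_regression stands in for it here; the hyperparameter values (n_estimators=100, max_features='sqrt') are illustrative choices, not the book's tuned settings.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the Boston house price data (13 features, like Boston)
X, y = make_regression(n_samples=500, n_features=13, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# n_estimators: number of trees in the forest
# max_features: number of features randomly considered at each split
regressor = RandomForestRegressor(
    n_estimators=100, max_features='sqrt', random_state=42)
regressor.fit(X_train, y_train)

# Each prediction is the average of the individual trees' predictions
predictions = regressor.predict(X_test)
print(regressor.score(X_test, y_test))  # R^2 on the held-out set
```

Averaging across many decorrelated trees is what reduces the variance of a single deep decision tree, which is why the ensemble typically generalizes better than any one of its members.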