Boosting
One of the most powerful ensemble learning techniques is boosting, which combines many weak learners into a single strong model. In this section, we will use XGBoost to model our time series data. Because XGBoost exposes many hyperparameters, we expect some fine-tuning to be needed before it produces satisfactory results. By replacing our example's regressor with lr = XGBRegressor(), we can fit XGBoost to our data. This results in an MSE of 19.20 and a Sharpe value of 0.13.
The figure depicts the profits and trades generated by the model. Although its Sharpe value is lower than that of the other models, it continues to generate profit even during periods in which the Bitcoin price drops.