8

Hyperparameter Optimization

How a Kaggle solution performs is not determined solely by the type of learning algorithm you choose. Aside from the data and the features you use, it is also strongly determined by the algorithm’s hyperparameters: the parameters of the algorithm that must be fixed before training and cannot be learned during the training process. Choosing the right data and features is most effective in tabular data competitions, but hyperparameter optimization is effective in competitions of any kind. In fact, given fixed data and a fixed algorithm, hyperparameter optimization is the only sure way to enhance the algorithm’s predictive performance and climb the leaderboard. It also helps in ensembling, ...
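To make the distinction concrete, here is a minimal sketch (an illustrative assumption, not an example from the book) using scikit-learn’s GridSearchCV with a random forest on synthetic data: the hyperparameters are fixed before each fit, and the search simply tries several candidate settings and keeps the one with the best cross-validated score.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data standing in for a competition's training set (hypothetical)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Hyperparameters must be set before training; a search evaluates
# several fixed settings and reports the best one.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 5, 10],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,
    scoring="roc_auc",
)
search.fit(X, y)

print(search.best_params_)  # best hyperparameter combination found
print(search.best_score_)   # its mean cross-validated AUC
```

An exhaustive grid is only one of the simpler strategies; the same idea underlies the random and Bayesian search methods discussed later in the chapter.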
