Running hyperparameter optimization (HPO)

Data scientists typically spend many hours and many experiments arriving at the set of hyperparameters that yields the best model performance. The process is largely one of trial and error.

Although grid search is one of the techniques traditionally used by data scientists, it suffers from the curse of dimensionality. For example, with two hyperparameters, each taking five possible values, we must evaluate the objective function 25 times (5 x 5). As the number of hyperparameters grows, the number of objective function evaluations explodes combinatorially.
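The 5 x 5 example can be sketched in a few lines of plain Python. Here, `objective` is a hypothetical stand-in for the validation metric you would compute by training a model; the hyperparameter names and candidate values are illustrative, not taken from any particular library:

```python
import itertools

# Hypothetical objective: pretend validation error as a function of
# two hyperparameters (lower is better).
def objective(learning_rate, max_depth):
    return (learning_rate - 0.1) ** 2 + (max_depth - 4) ** 2

learning_rates = [0.001, 0.01, 0.1, 0.5, 1.0]  # 5 candidate values
max_depths = [2, 4, 6, 8, 10]                  # 5 candidate values

# Exhaustive grid search: every combination is evaluated, 5 x 5 = 25 calls.
grid = list(itertools.product(learning_rates, max_depths))
best = min(grid, key=lambda params: objective(*params))

print(len(grid))  # 25 objective evaluations
print(best)       # (0.1, 4)
```

Adding a third hyperparameter with five values would raise the count to 125 evaluations, which is the blow-up the text describes.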

Random Search addresses this issue by randomly selecting values of hyperparameters, without doing ...
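A minimal sketch of the random-search idea, reusing the same hypothetical objective as above: instead of enumerating every combination, we draw a fixed budget of random hyperparameter settings, so the number of evaluations no longer depends on the size of the grid. The budget and sampling ranges here are illustrative choices:

```python
import random

# Same hypothetical objective as in the grid search sketch.
def objective(learning_rate, max_depth):
    return (learning_rate - 0.1) ** 2 + (max_depth - 4) ** 2

random.seed(0)
budget = 10  # fixed number of evaluations, chosen by the practitioner

# Sample hyperparameter settings at random from their ranges.
samples = [
    (random.uniform(0.001, 1.0), random.randint(2, 10))
    for _ in range(budget)
]
best = min(samples, key=lambda params: objective(*params))

print(len(samples))  # 10 objective evaluations, regardless of grid size
print(best)
```

With only 10 evaluations instead of 25, random search trades exhaustiveness for a controllable budget, which is why it scales better as hyperparameters are added.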
