Parameter hyper-tuning
Rarely would our first model be the best we can do. Simply looking at our metrics and accepting the model because it passed our preconceived performance thresholds is hardly a scientific method for finding the best one.
The concept of parameter hyper-tuning is to find the best parameters of the model: for example, the maximum number of iterations needed to properly estimate a logistic regression model, or the maximum depth of a decision tree.
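To make this concrete, here is a minimal sketch of how such parameters are specified on PySpark ML estimators; the particular values shown are arbitrary illustrations, not recommendations.

```python
from pyspark.ml.classification import LogisticRegression, DecisionTreeClassifier

# Hyper-parameters are passed when the estimator is constructed
# (they can also be changed later with the corresponding setters).
logistic = LogisticRegression(maxIter=10, regParam=0.01)
tree = DecisionTreeClassifier(maxDepth=5)
```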
In this section, we will explore two concepts that allow us to find the best parameters for our models: grid search and train-validation splitting.
Grid search
Grid search is an exhaustive algorithm that loops through every combination of the defined parameter values, estimates a separate model for each combination, and selects the best-performing one according to some evaluation metric.
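In PySpark's ML package, the grid of candidate values is typically built with ParamGridBuilder and handed to a tuning estimator such as CrossValidator. The following is a minimal sketch under the assumption that a training DataFrame train_df (a hypothetical name) already has 'features' and 'label' columns.

```python
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator
from pyspark.ml.tuning import ParamGridBuilder, CrossValidator

logistic = LogisticRegression(featuresCol='features', labelCol='label')

# Enumerate every combination of the listed values: 2 x 3 = 6 candidate models.
grid = (
    ParamGridBuilder()
    .addGrid(logistic.maxIter, [10, 50])
    .addGrid(logistic.regParam, [0.01, 0.05, 0.3])
    .build()
)

evaluator = BinaryClassificationEvaluator(labelCol='label')

# CrossValidator fits a model for each parameter combination and keeps the one
# that scores best on the evaluator's metric (areaUnderROC by default).
cv = CrossValidator(
    estimator=logistic,
    estimatorParamMaps=grid,
    evaluator=evaluator,
    numFolds=3,
)

cv_model = cv.fit(train_df)
best_model = cv_model.bestModel
```

The same grid can instead be passed to a TrainValidationSplit, which evaluates each combination on a single train/validation split rather than with cross-validation.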