Chapter 4: Exploring Bayesian Optimization
Bayesian optimization (BO) is the second of the four groups of hyperparameter tuning methods. Unlike grid search and random search, which are categorized as uninformed search methods, all of the methods in the BO group are categorized as informed search methods, meaning they learn from the results of previous iterations to (hopefully) propose more promising hyperparameter values in later iterations.
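To make the "informed" idea concrete, here is a minimal sketch using the Optuna library, which implements TPE, one of the methods covered in this chapter. This is not code from the book; the objective function and the alpha search range are purely illustrative placeholders, assuming Optuna is installed:

```python
import optuna

# Hypothetical objective: tune a regularization strength "alpha".
# In practice this would train and validate an actual model and
# return its cross-validated error.
def objective(trial):
    alpha = trial.suggest_float("alpha", 1e-4, 10.0, log=True)
    # Placeholder score; replace with your model's validation error.
    return (alpha - 0.5) ** 2

# TPE is an informed (Bayesian) sampler: each new trial is proposed
# based on the scores observed in all previous trials.
sampler = optuna.samplers.TPESampler(seed=42)
study = optuna.create_study(direction="minimize", sampler=sampler)
study.optimize(objective, n_trials=30)

print(study.best_params, study.best_value)
```

Unlike random search, which would draw every value of alpha independently, the sampler here conditions each new suggestion on all of the trial results collected so far.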
In this chapter, we will discuss several methods that belong to the BO group, including Gaussian process (GP), sequential model-based algorithm configuration (SMAC), Tree-structured Parzen Estimators (TPE), and Metis. Similar to Chapter 3, Exploring Exhaustive Search, we will discuss the definition of each method, the differences ...