Chapter 6: Exploring Multi-Fidelity Optimization

Multi-Fidelity Optimization (MFO) is the fourth of the four groups of hyperparameter tuning methods. The defining characteristic of this group is that all of its methods exploit cheap approximations of the full hyperparameter tuning pipeline, such as training on a subset of the data or for fewer epochs, so that we can achieve similar performance at a much lower computational cost and with a shorter experiment time. This group is well suited to cases where you have a very large model or a very large number of samples, for example, when you are developing a neural-network-based model.

In this chapter, we will discuss several methods in the MFO group, including coarse-to-fine search, successive halving, Hyperband, and Bayesian Optimization and Hyperband (BOHB). As ...
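To make the core MFO idea concrete before diving into the individual methods, here is a minimal sketch of successive halving in plain Python. The objective function, configuration space, and parameter values below are all hypothetical stand-ins, not the book's actual examples: `evaluate` mimics training a model under a given budget, where a larger budget yields a less noisy estimate of the configuration's true quality.

```python
import random

def evaluate(config, budget):
    # Hypothetical toy objective: stands in for training a model with
    # `budget` resources (epochs, samples, etc.) and returning a score.
    # A larger budget gives a less noisy estimate of true quality;
    # the true optimum here is lr = 0.1.
    noise = random.gauss(0, 1.0 / budget)
    return -(config["lr"] - 0.1) ** 2 + noise

def successive_halving(configs, min_budget=1, eta=2, rounds=4):
    """Evaluate all configs on a cheap budget, then repeatedly keep
    the best 1/eta fraction while multiplying the budget by eta."""
    budget = min_budget
    survivors = list(configs)
    for _ in range(rounds):
        scored = [(evaluate(c, budget), c) for c in survivors]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        survivors = [c for _, c in scored[: max(1, len(scored) // eta)]]
        budget *= eta
        if len(survivors) == 1:
            break
    return survivors[0]

random.seed(0)
candidates = [{"lr": random.uniform(0.001, 1.0)} for _ in range(16)]
best = successive_halving(candidates)
```

Note how most of the compute is spent on the few surviving configurations: with 16 candidates and `eta=2`, only one configuration ever receives the full budget, which is exactly the cost saving that motivates this group of methods.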
