5 Hyperparameter optimization service
This chapter covers
- Hyperparameters and why they are important
- Two common approaches to hyperparameter optimization (HPO)
- Designing an HPO service
- Three popular HPO libraries: Hyperopt, Optuna, and Ray Tune
In the previous two chapters, we saw how models are trained: a training service manages training processes in a remote compute cluster, given a model algorithm. But model algorithms and training services aren’t all there is to model training. There’s one more component we haven’t discussed yet—hyperparameter optimization (HPO). Data scientists often overlook how significantly hyperparameter choices can influence model training results, especially when these decisions can be automated using ...
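Before looking at dedicated libraries, it helps to see the core HPO loop in miniature. The sketch below is a library-free random search over two hyperparameters; the `train_model` objective is a hypothetical stand-in (in a real system it would launch a training job and return a validation metric), and all names here are illustrative, not from the chapter.

```python
import random

def train_model(lr, batch_size):
    """Stand-in for a real training run; returns a validation loss.

    Hypothetical toy objective: best near lr=0.01, batch_size=64.
    An HPO service would instead submit a training job and collect its metric.
    """
    return (lr - 0.01) ** 2 + abs(batch_size - 64) / 1000

def random_search(n_trials=50, seed=0):
    """Try n_trials random hyperparameter combinations; keep the best."""
    rng = random.Random(seed)
    best_loss, best_params = None, None
    for _ in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-4, -1),              # log-uniform learning rate
            "batch_size": rng.choice([16, 32, 64, 128]),  # categorical choice
        }
        loss = train_model(**params)
        if best_loss is None or loss < best_loss:
            best_loss, best_params = loss, params
    return best_loss, best_params

best_loss, best_params = random_search()
print(best_loss, best_params)
```

Libraries such as Hyperopt, Optuna, and Ray Tune, covered later in this chapter, replace the random sampling above with smarter search strategies and handle trial scheduling, parallelism, and result tracking.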