3. Hyperparameter Optimization Under the Hood

In the previous chapter, we saw that even simple HPO techniques can produce very impressive results. Hyperparameter Optimization not only tunes a specific model for a dataset but can even construct new architectures. Still, we have used only an elementary set of tools for HPO tasks so far: up to this point, we have relied on the primitive Random Search and Grid Search Tuners. We learned from the previous chapter that search spaces can contain millions ...
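Before looking under the hood of these tuners, it helps to recall how a tuner is selected in the first place. The sketch below shows a minimal NNI experiment that plugs in the built-in Random tuner (replacing the name with 'GridSearch' selects the Grid Search Tuner instead); the search space, trial script name, and trial counts are illustrative assumptions rather than an example from the book.

```python
from nni.experiment import Experiment

# Illustrative search space: two hyperparameters, not tied to any specific model.
search_space = {
    "lr": {"_type": "loguniform", "_value": [1e-4, 1e-1]},
    "batch_size": {"_type": "choice", "_value": [16, 32, 64]},
}

experiment = Experiment('local')
experiment.config.trial_command = 'python trial.py'   # hypothetical trial script
experiment.config.trial_code_directory = '.'
experiment.config.search_space = search_space

# The tuner is just a named, pluggable component of the experiment:
experiment.config.tuner.name = 'Random'                # or 'GridSearch'

experiment.config.max_trial_number = 20
experiment.config.trial_concurrency = 2

# Launch the experiment and the web UI on a local port.
experiment.run(8080)
```

The point of this sketch is that the tuner is an interchangeable component: the rest of the experiment stays the same while the search strategy behind `experiment.config.tuner.name` changes, which is exactly the part this chapter examines in detail.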