9. Hyperparameter tuning and Automated Machine Learning

In the previous chapter, we learned how to train convolutional and other complex deep neural networks (DNNs). When training these models, we are often confronted with difficult parametrization choices, such as the number and order of layers, regularization, batch size, learning rate, the number of epochs, and more. This is not unique to DNNs; the same problem arises when selecting the right preprocessing steps, features, models, and parameters in statistical ML approaches.
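To make the scale of this search concrete, here is a minimal sketch of exhaustively cross-validating a few such parameters with scikit-learn's `GridSearchCV`. The dataset, model, and parameter values are purely illustrative assumptions, not taken from this chapter:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Hypothetical stand-in for real data: a small synthetic dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Even three parameters with a handful of candidate values each yield
# 3 * 2 * 3 = 18 configurations, each trained and cross-validated.
param_grid = {
    "C": [0.01, 0.1, 1.0],           # inverse regularization strength
    "solver": ["lbfgs", "liblinear"],
    "max_iter": [100, 500, 1000],
}

# 3-fold cross-validation over the full grid: 18 * 3 = 54 model fits.
search = GridSearchCV(LogisticRegression(), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

The combinatorial growth shown in the comments is exactly why exhaustive human (or grid) search quickly becomes impractical, motivating the automated approaches discussed in this chapter.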

In this chapter, we will look at optimizing the training process in order to take some of these error-prone human choices out of machine learning. These ...
