Hyper-parameters can be loosely defined as settings that are chosen before training and do not change while the model is trained: for example, the number of layers in an FFNN, the number of neurons in each layer, the activation functions, the learning rate, and so on.
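To make the distinction concrete, here is a minimal sketch (using a toy one-parameter linear model fit by gradient descent, an illustrative example not taken from the text) showing how hyper-parameters such as the learning rate stay fixed while the model's ordinary parameters are updated by training:

```python
import numpy as np

# Hyper-parameters: chosen before training and held fixed throughout.
learning_rate = 0.1   # step size for gradient descent
n_steps = 100         # number of training iterations

# Toy data for fitting y = w * x, generated with a true slope of 3.0.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
y = 3.0 * x

# A (learned) parameter: updated by training.
w = 0.0
for _ in range(n_steps):
    grad = 2 * np.mean((w * x - y) * x)  # d/dw of the mean squared error
    w -= learning_rate * grad            # the parameter w changes ...
    # ... but learning_rate and n_steps stay constant the whole time

print(round(w, 2))  # w converges toward the true slope 3.0
```

Choosing a different learning rate or number of steps would change how well training converges, which is exactly why finding good values for them matters.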
This chapter looks at the problem of finding the hyper-parameters that yield the best results from your models, a process called hyper-parameter tuning. We first describe what a black-box optimization problem is and how that class of problems relates to hyper-parameter tuning. We then look at the three most common methods for tackling ...