Chapter 13. Grid Search
In Chapter 12 we demonstrated how users can mark or tag arguments in preprocessing recipes and/or model specifications for optimization using the
tune() function. Once we know what to optimize, it’s time to address the question of how to optimize the parameters. This chapter describes grid search methods that specify the possible values of the parameters a priori. (Chapter 14 will continue the discussion by describing iterative search methods.)
Let’s start by looking at two main approaches for assembling a grid.
Regular and Nonregular Grids
There are two main types of grids. A regular grid combines each parameter (with its corresponding set of possible values) factorially, i.e., by using all combinations of the sets. Alternatively, a nonregular grid is one where the parameter combinations are not formed from a small set of points.
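As a small illustration of a regular grid (a sketch only; the candidate values below are made up for demonstration), the factorial combination of two parameter sets can be produced with the tidyr package:

```r
library(tidyr)

# A regular grid uses every combination of the candidate sets.
# Three hidden-unit values crossed with two penalty values gives 3 x 2 = 6 rows.
crossing(
  hidden_units = c(3, 5, 10),
  penalty = c(0.001, 0.1)
)
```

A nonregular grid, by contrast, would place its six candidate points anywhere in the two-dimensional parameter space rather than on this lattice of crossed values.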
Before we look at each type in more detail, let’s consider an example model: the multilayer perceptron model (a.k.a. single-layer artificial neural network). The parameters marked for tuning are:
The number of hidden units
The number of fitting epochs/iterations in model training
The amount of weight decay penalization
Using parsnip, the specification for a classification model fit using the nnet package is:
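A sketch of that specification, marking the three parameters listed above with `tune()` placeholders (the object name `mlp_spec` is our choice):

```r
library(parsnip)

mlp_spec <- 
  mlp(hidden_units = tune(), penalty = tune(), epochs = tune()) %>% 
  set_engine("nnet", trace = 0) %>% 
  set_mode("classification")
```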
The argument trace = 0 prevents extra logging of the training process.