Chapter 14. Iterative Search
Chapter 13 demonstrated how grid search takes a predefined set of candidate values, evaluates them, and then chooses the best settings. Iterative search methods pursue a different strategy: during the search, they use the results observed so far to predict which values to test next.
When grid search is infeasible or inefficient, iterative methods are a sensible approach for optimizing tuning parameters.
This chapter outlines two search methods. First, we discuss Bayesian optimization, which uses a statistical model to predict better parameter settings. After that, the chapter describes a global search method called simulated annealing.
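Both methods share a common skeleton: start from one or more initial candidates, then alternate between proposing a new candidate based on the history and evaluating it. The sketch below illustrates that loop with a hypothetical quadratic objective and a simple perturbation-based proposal; in practice, the objective would be resampled model performance and the proposal step would come from a Bayesian-optimization or simulated-annealing rule.

```python
import random

def objective(cost, sigma):
    # Stand-in for resampled model performance; a real search would
    # fit and evaluate the SVM at these candidate tuning parameters.
    return -((cost - 2.0) ** 2 + (sigma - 0.5) ** 2)

def propose(history):
    # Hypothetical proposal step: perturb the best candidate seen so far.
    # Bayesian optimization or simulated annealing would replace this.
    best = max(history, key=lambda h: h[2])
    return (best[0] + random.gauss(0, 0.5), best[1] + random.gauss(0, 0.1))

random.seed(1)
history = [(1.0, 1.0, objective(1.0, 1.0))]  # initial candidate
for _ in range(50):
    cost, sigma = propose(history)
    history.append((cost, sigma, objective(cost, sigma)))

best = max(history, key=lambda h: h[2])
print(best)  # best (cost, sigma, score) found during the search
```

The key contrast with grid search is that each new candidate depends on the results already collected, so the search can concentrate effort in promising regions of the parameter space.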
We use the same data on cell characteristics as the previous chapter for illustration but change the model. This chapter uses a support vector machine model because it provides nice two-dimensional visualizations of the search processes.
A Support Vector Machine Model
We once again use the cell segmentation data, described in Chapter 13, for modeling, with a support vector machine (SVM) model to demonstrate sequential tuning methods. See Kuhn and Johnson (2013) for more information on this model. The two tuning parameters to optimize are the SVM cost value and the radial basis function kernel parameter σ. Both parameters can have a profound effect on model complexity and performance.
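To make the kernel parameter concrete, the radial basis function measures the similarity of two points as an exponentially decaying function of their squared distance. The sketch below uses the parameterization K(x, z) = exp(-σ‖x − z‖²), one common convention (others use 1/(2σ²)); the point is how σ controls the decay rate, and thus how locally the model behaves:

```python
import math

def rbf_kernel(x, z, sigma):
    # K(x, z) = exp(-sigma * ||x - z||^2): larger sigma makes the
    # similarity decay faster, yielding a more flexible decision boundary.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sigma * sq_dist)

x, z = (0.0, 0.0), (1.0, 1.0)        # squared distance = 2
print(rbf_kernel(x, z, sigma=0.1))   # gentle decay: points look similar
print(rbf_kernel(x, z, sigma=5.0))   # sharp decay: points look dissimilar
```

Small σ values produce smooth, nearly linear boundaries, while large values let the model wrap tightly around the training data, which is why σ interacts strongly with the cost parameter during tuning.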
The SVM model uses a dot product and, for this reason, it is necessary to center and scale the predictors. Like the multilayer perceptron model, this model ...
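Because the kernel is computed from distances between predictor values, a predictor measured on a large scale would dominate the kernel unless all predictors are put on a common scale first. A minimal standardization sketch (pure Python, no particular modeling library assumed):

```python
def standardize(columns):
    # Center each predictor to mean 0 and scale to standard deviation 1,
    # so no single predictor dominates the kernel distance calculation.
    out = []
    for col in columns:
        n = len(col)
        mean = sum(col) / n
        sd = (sum((v - mean) ** 2 for v in col) / (n - 1)) ** 0.5
        out.append([(v - mean) / sd for v in col])
    return out

# Two predictors on very different scales
raw = [[1.0, 2.0, 3.0, 4.0], [100.0, 300.0, 200.0, 400.0]]
scaled = standardize(raw)
print(scaled)  # each column now has mean 0 and unit standard deviation
```

In practice the centering and scaling statistics are estimated on the training set and then applied unchanged to new data, so that resampling estimates remain honest.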