Chapter 2. Optimization algorithms—an overview
In this chapter we look at the panorama of methods that have been developed to try to solve the optimization problems of Chapter 1, before diving into R's particular tools for such tasks. Again, R is in the background. The aim here is to give some structure to the subject. I recommend that all novices to optimization at least skim this chapter to get a perspective on the field. You will likely save yourself many hours of grief if you have a good sense of which approach is likely to suit your problem.
2.1 Methods that use the gradient
If we seek a single (local) minimum of a function f(x), possibly subject to constraints, one of the most obvious approaches is to compute the gradient of the function and proceed in the reverse direction, that is, proceed “downhill.” The gradient is the n-dimensional slope of the function, where n is the number of parameters; it is a concept from the differential calculus and generally a source of anxiety for nonmathematics students.
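To make the gradient concrete, here is a small R sketch, my own illustration rather than anything from the text, that approximates the gradient by forward differences. The names numgrad and f and the step size h are invented for the example.

    # Forward-difference approximation to the gradient of f at the point x.
    # (Illustrative only; h is a typical but arbitrary step size.)
    numgrad <- function(f, x, h = 1e-7) {
      fx <- f(x)
      g <- numeric(length(x))
      for (i in seq_along(x)) {
        xh <- x
        xh[i] <- xh[i] + h
        g[i] <- (f(xh) - fx) / h
      }
      g
    }

    # Example: f(x) = x1^2 + 3*x2^2 has analytic gradient (2*x1, 6*x2).
    f <- function(x) x[1]^2 + 3 * x[2]^2
    numgrad(f, c(1, 2))  # approximately 2 and 12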
Gradient descent is the basis of one of the oldest approaches to optimization, the method of steepest descents (Cauchy, 1848). Let us assume that we are at a point x (which will be a vector if we have more than one parameter) ...
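To show the idea in code, the following is a minimal steepest-descent sketch in R. It is my own illustration under simple assumptions (a crude step-halving line search and a gradient-norm stopping test), not the book's implementation; the names steepest, f, and gr are invented for the example.

    # Minimal steepest descent: repeatedly step along the negative gradient,
    # halving the step size until the function value actually decreases.
    steepest <- function(f, gr, x0, maxit = 100, tol = 1e-8) {
      x <- x0
      for (it in seq_len(maxit)) {
        g <- gr(x)
        if (sqrt(sum(g^2)) < tol) break  # gradient near zero: assume a minimum
        step <- 1
        while (f(x - step * g) >= f(x) && step > 1e-12) {
          step <- step / 2  # backtrack until we actually move downhill
        }
        x <- x - step * g
      }
      list(par = x, value = f(x), iterations = it)
    }

    # Example with the quadratic used above; the gradient is supplied analytically.
    f  <- function(x) x[1]^2 + 3 * x[2]^2
    gr <- function(x) c(2 * x[1], 6 * x[2])
    steepest(f, gr, c(1, 2))  # converges toward the minimum at (0, 0)

Even on a well-behaved quadratic like this, plain steepest descent tends to zigzag and converge slowly, which is one reason more sophisticated gradient methods were developed.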