CHAPTER 14

GLOBAL SEARCH ALGORITHMS

14.1 Introduction

The iterative algorithms in previous chapters, in particular gradient methods, Newton’s method, conjugate gradient methods, and quasi-Newton methods, start with an initial point and then generate a sequence of iterates. Typically, the best we can hope for is that the sequence converges to a local minimizer. For this reason, it is often desirable for the initial point to be close to a global minimizer. Moreover, these methods require first derivatives (and also second derivatives in the case of Newton’s method).

In this chapter we discuss various search methods that are global in nature in the sense that they attempt to search throughout the entire feasible set. These methods use only objective function values and do not require derivatives. Consequently, they are applicable to a much wider class of optimization problems. In some cases, they can also be used to generate “good” initial (starting) points for the iterative methods discussed in earlier chapters. Some of the methods we discuss in this chapter (specifically, the randomized search methods) are also used in combinatorial optimization, where the feasible set is finite (discrete), but typically large.
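To make the idea of using only objective function values concrete, the following is a minimal sketch of how a crude random search over a box might be used to pick a "good" starting point for an iterative, derivative-based method. The test function (Himmelblau's function), the box bounds, and the sample count are illustrative assumptions, not taken from the text.

    import numpy as np

    def random_search_start(f, lower, upper, n_samples=1000, seed=0):
        """Sample f uniformly over the box [lower, upper] and return the
        best sample, e.g., to use as a starting point for a derivative-based
        iterative method. Uses only objective function evaluations."""
        rng = np.random.default_rng(seed)
        lower = np.asarray(lower, dtype=float)
        upper = np.asarray(upper, dtype=float)
        samples = rng.uniform(lower, upper, size=(n_samples, lower.size))
        values = np.array([f(x) for x in samples])
        best = int(np.argmin(values))
        return samples[best], values[best]

    # Example: search the box [-5, 5] x [-5, 5] of Himmelblau's function.
    f = lambda x: (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2
    x0, f0 = random_search_start(f, [-5.0, -5.0], [5.0, 5.0])
    print(x0, f0)

A point chosen this way can then be handed to, say, a gradient or quasi-Newton method as its initial point.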

14.2 The Nelder-Mead Simplex Algorithm

The method originally proposed by Spendley, Hext, and Himsworth [122] in 1962 was improved by Nelder and Mead [97] in 1965, and it is now commonly referred to as the Nelder-Mead simplex algorithm. A contemporary view of the algorithm ...
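As a rough illustration of how the simplex operations (reflection, expansion, contraction, and shrinkage) fit together, the following Python sketch implements a simplified variant of the iteration. The coefficient values, the initial simplex construction, and the stopping rule are common choices assumed here for illustration; in particular, only "inside" contraction is used, whereas fuller treatments distinguish inside and outside contractions.

    import numpy as np

    def nelder_mead(f, x0, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5,
                    max_iter=1000, tol=1e-8):
        """Minimize f: R^n -> R with a simplified Nelder-Mead simplex
        iteration (reflection alpha, expansion gamma, contraction rho,
        shrinkage sigma). Uses only objective function values."""
        x0 = np.asarray(x0, dtype=float)
        n = x0.size
        # Initial simplex: x0 plus n points perturbed along each coordinate.
        simplex = [x0.copy()]
        for i in range(n):
            v = x0.copy()
            v[i] += 0.05 if v[i] == 0 else 0.05 * v[i]
            simplex.append(v)
        fvals = [f(v) for v in simplex]

        for _ in range(max_iter):
            # Order vertices from best (lowest f) to worst.
            order = np.argsort(fvals)
            simplex = [simplex[i] for i in order]
            fvals = [fvals[i] for i in order]

            # Stop when the function values across the simplex are nearly equal.
            if fvals[-1] - fvals[0] < tol:
                break

            centroid = np.mean(simplex[:-1], axis=0)  # centroid of all but worst

            # Reflection of the worst vertex through the centroid.
            xr = centroid + alpha * (centroid - simplex[-1])
            fr = f(xr)
            if fvals[0] <= fr < fvals[-2]:
                simplex[-1], fvals[-1] = xr, fr
            elif fr < fvals[0]:
                # Expansion: the reflected point is the new best; try further.
                xe = centroid + gamma * (xr - centroid)
                fe = f(xe)
                if fe < fr:
                    simplex[-1], fvals[-1] = xe, fe
                else:
                    simplex[-1], fvals[-1] = xr, fr
            else:
                # Contraction of the worst vertex toward the centroid
                # (inside contraction only, in this simplified variant).
                xc = centroid + rho * (simplex[-1] - centroid)
                fc = f(xc)
                if fc < fvals[-1]:
                    simplex[-1], fvals[-1] = xc, fc
                else:
                    # Shrink the whole simplex toward the best vertex.
                    simplex = [simplex[0]] + [simplex[0] + sigma * (v - simplex[0])
                                              for v in simplex[1:]]
                    fvals = [fvals[0]] + [f(v) for v in simplex[1:]]

        best = int(np.argmin(fvals))
        return simplex[best], fvals[best]

    # Example: minimize the Rosenbrock function starting from (-1.2, 1).
    rosenbrock = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    x_min, f_min = nelder_mead(rosenbrock, [-1.2, 1.0])
    print(x_min, f_min)

Note that the iteration maintains a simplex of n + 1 points in R^n and replaces (or shrinks toward) its worst vertex at each step, so no derivative information is required.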
