
An Introduction to Optimization, 4th Edition, by Edwin K. P. Chong and Stanislaw H. Zak


CHAPTER 7

ONE-DIMENSIONAL SEARCH METHODS

7.1 Introduction

In this chapter, we are interested in the problem of minimizing an objective function f : ℝ → ℝ (i.e., a one-dimensional problem). The approach is to use an iterative search algorithm, also called a line-search method. One-dimensional search methods are of interest for the following reasons. First, they are special cases of search methods used in multivariable problems. Second, they are used as part of general multivariable algorithms (as described later in Section 7.8).

In an iterative algorithm, we start with an initial candidate solution x(0) and generate a sequence of iterates x(1), x(2),…. For each iteration k = 0,1,2,…, the next point x(k+1) depends on x(k) and the objective function f. The algorithm may use only the value of f at specific points, or perhaps its first derivative f′, or even its second derivative f″. In this chapter, we study several algorithms:

Golden section method (uses only f)
Fibonacci method (uses only f)
Bisection method (uses only f′)
Secant method (uses only f′)
Newton’s method (uses f′ and f″)
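
As a preview of the kind of algorithm developed in this chapter, the sketch below is a minimal Python implementation of the golden section method, the first entry in the list above. It is an illustration under stated assumptions rather than the book's pseudocode: it assumes f is unimodal on the initial interval [a, b], and the function name golden_section, the tolerance tol, and the example function are choices made here for concreteness. Each iteration produces the next bracketing interval from the current one using only values of f, matching the iterative framework just described.

import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] using only function values.
    Returns the midpoint of the final bracketing interval."""
    rho = (3 - math.sqrt(5)) / 2   # golden-section ratio, about 0.382
    x1 = a + rho * (b - a)         # left interior point
    x2 = b - rho * (b - a)         # right interior point
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # Minimizer lies in [a, x2]; the old x1 becomes the new x2,
            # so its function value is reused without recomputation.
            b, x2, f2 = x2, x1, f1
            x1 = a + rho * (b - a)
            f1 = f(x1)
        else:
            # Minimizer lies in [x1, b]; the old x2 becomes the new x1.
            a, x1, f1 = x1, x2, f2
            x2 = b - rho * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: f(x) = x^2 - 2x is unimodal on [0, 3] with minimizer x = 1.
print(golden_section(lambda x: x**2 - 2*x, 0.0, 3.0))   # about 1.0

The specific value ρ = (3 − √5)/2 is what allows one of the two interior function evaluations to be carried over from each iteration to the next, so only one new evaluation of f is needed per step; this property is made precise later in the chapter.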
