In this chapter, we are interested in the problem of minimizing an objective function *f* : ℝ → ℝ (i.e., a one-dimensional problem). The approach is to use an iterative search algorithm, also called a line-search method. One-dimensional search methods are of interest for two reasons. First, they are special cases of search methods used in multivariable problems. Second, they are used as part of general multivariable algorithms (as described later in Section 7.8).

In an iterative algorithm, we start with an initial candidate solution *x*^{(0)} and generate a sequence of *iterates x*^{(1)}, *x*^{(2)},…. For each iteration *k* = 0,1,2,…, the next point *x*^{(k+1)} depends on *x*^{(k)} and the objective function *f*. The algorithm may use only the value of *f* at specific points, or perhaps its first derivative *f*′, or even its second derivative *f*″. In this chapter, we study several algorithms:

Golden section method (uses only *f*)

Fibonacci method (uses only *f*)

Bisection method (uses only *f*′)

Secant method (uses only *f*′)

Newton’s method (uses *f*′ and *f*″)
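To make the iterative scheme concrete, here is a minimal sketch of one of the methods above, the bisection method, which uses only the sign of *f*′ at each iterate. The stopping tolerance, iteration cap, and the example function *f*(*x*) = (*x* − 2)² are illustrative choices, not prescribed by the text.

```python
def bisection_min(df, a, b, tol=1e-8, max_iter=100):
    """Locate a minimizer of a unimodal f on [a, b] by bisecting on f'."""
    for _ in range(max_iter):
        x = (a + b) / 2.0               # candidate iterate x^(k)
        if abs(df(x)) < tol or (b - a) / 2.0 < tol:
            return x
        if df(x) > 0:                   # f increasing at x: minimizer lies to the left
            b = x
        else:                           # f decreasing at x: minimizer lies to the right
            a = x
    return (a + b) / 2.0

# Example: f(x) = (x - 2)^2 has f'(x) = 2(x - 2), with minimizer x* = 2.
x_star = bisection_min(lambda x: 2.0 * (x - 2.0), 0.0, 5.0)
```

Each pass through the loop computes *x*^{(k+1)} from the current bracketing interval, mirroring the general pattern in which the next point depends only on the current information about *f*.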
