8 Elementary Gradient‐Based Optimizers: CSLS and ISD

8.1 Introduction

Regardless of the DV dimension, the negative gradient indicates the direction of steepest descent, which is a logical direction in which to begin the search. This chapter presents two gradient‐based searches: Cauchy’s sequential line search (CSLS) and incremental steepest descent (ISD). Although the exercises are primarily 2‐D applications, both methods are applicable to N‐D situations. The optimization statement is

$$\min_{\{x_1,\, x_2,\, \ldots,\, x_N\}} \; J = f(x_1, x_2, \ldots, x_N) \tag{8.1}$$
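Here the DVs are collected in the vector $\mathbf{x} = (x_1, x_2, \ldots, x_N)^{T}$. As stated above, the search direction of steepest descent from a trial solution $\mathbf{x}^{(k)}$ is the negative of the gradient (the symbol $\mathbf{s}$ for the search direction is a standard notation, assumed here):

$$\mathbf{s}^{(k)} = -\nabla f\!\left(\mathbf{x}^{(k)}\right)$$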

8.2 Cauchy’s Sequential Line Search

Although the objective is to find the true minimum in a multi‐DV application, the Cauchy sequential line search (CSLS) applies a sequence of univariate (single‐DV) searches, each along a straight line of steepest descent through the multidimensional DV space. At the minimum along one line, the gradient is reevaluated to define a new direction of steepest descent. This univariate search along a line is repeated until convergence.
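Written with a scalar step size $\alpha$ along the current direction $\mathbf{s}^{(k)}$ (again a standard notation, not from the excerpt), each stage of CSLS solves the univariate problem

$$\alpha^{*} = \arg\min_{\alpha \ge 0} f\!\left(\mathbf{x}^{(k)} + \alpha\,\mathbf{s}^{(k)}\right), \qquad \mathbf{x}^{(k+1)} = \mathbf{x}^{(k)} + \alpha^{*}\,\mathbf{s}^{(k)}$$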

This is a nested optimization. At each stage, an inner optimization searches along the current line to find the minimum on that line. These line searches are repeated until a new search direction no longer improves the OF.

First, start at an initial trial solution, and use the negative gradient of the function to define the line of steepest descent. Use any of the several 1‐D algorithms to search for the optimum along the line. This requires ...
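Putting these steps together, here is a minimal Python sketch of CSLS, assuming a central‐difference gradient and a golden‐section method for the inner 1‐D search; the names (cauchy_sls, golden_section) and the bracket parameter alpha_max are illustrative choices, not from the text.

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def golden_section(phi, a, b, tol=1e-8):
    """Minimize a unimodal 1-D function phi on [a, b] by golden-section search."""
    inv = (np.sqrt(5.0) - 1.0) / 2.0           # golden-ratio conjugate, ~0.618
    c, d = b - inv * (b - a), a + inv * (b - a)
    fc, fd = phi(c), phi(d)
    while (b - a) > tol:
        if fc < fd:                             # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv * (b - a)
            fc = phi(c)
        else:                                   # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv * (b - a)
            fd = phi(d)
    return 0.5 * (a + b)

def cauchy_sls(f, x0, alpha_max=1.0, tol=1e-6, max_iter=200):
    """CSLS: repeat 1-D minimizations along the local steepest-descent
    direction until the gradient vanishes or the OF stops improving."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = numerical_gradient(f, x)
        if np.linalg.norm(g) < tol:             # near-zero gradient: converged
            break
        s = -g / np.linalg.norm(g)              # unit steepest-descent direction
        # Inner optimization: univariate search along the line x + alpha*s,
        # bracketed on [0, alpha_max] (an assumed, problem-dependent bracket).
        alpha = golden_section(lambda a: f(x + a * s), 0.0, alpha_max)
        if f(x + alpha * s) >= f(x):            # new direction gives no improvement
            break
        x = x + alpha * s
    return x

# Demo on a 2-DV quadratic with its minimum at (3, -2).
if __name__ == "__main__":
    f = lambda x: (x[0] - 3.0) ** 2 + 5.0 * (x[1] + 2.0) ** 2
    print(cauchy_sls(f, [0.0, 0.0]))            # ~ [3. -2.]
```

Any of the other 1‐D algorithms mentioned above could replace the golden‐section inner search; only the line-minimization step changes, not the outer CSLS logic.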
