7 Multidimension Application Introduction and the Gradient
7.1 Introduction
Many optimization applications have two or more decision variables (DVs). The previous chapters focused on univariate, one-DV, one-dimensional (1-D) applications. Although N-DV applications are more frequently encountered, univariate searches are important, and they reveal key issues in relatively uncomplicated situations. There are new issues and techniques associated with 2-D and higher-dimension applications, and this chapter begins a section of the book devoted to 2-D (two-decision-variable) and N-D applications. Each section of this book on optimization algorithms introduces techniques with 2-D examples and then extends them to N-D applications. In 2-D, we still have the generic optimization concept for maximization:

\[ \max_{\{x_1,\, x_2\}} J = f(x_1, x_2) \]

in which the two DVs, x1 and x2, are explicitly revealed. Alternately, it could be stated as a minimization of the negative of the OF and/or by using the vector symbol for the DV set:

\[ \min_{\{\vec{x}\}} J = -f(\vec{x}) \]
In a two-DV situation, the OF is a response to two variables, J = f(x1, x2), which means that the OF value can be plotted as a third dimension, a surface over the two DV values.
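A minimal sketch of the two-DV idea in Python: maximizing J = f(x1, x2) is equivalent to minimizing -f(x1, x2). The objective function and the brute-force grid search below are hypothetical illustrations, not examples from the book.

```python
def f(x1, x2):
    # Hypothetical two-DV objective: a concave quadratic with a
    # single maximum J = 5 at (x1, x2) = (1, 2).
    return 5.0 - (x1 - 1.0) ** 2 - (x2 - 2.0) ** 2

def grid_search(obj, lo, hi, steps=101):
    """Brute-force minimization of obj over a square [lo, hi] x [lo, hi]
    grid; returns (best_x1, best_x2, best_value)."""
    best = None
    h = (hi - lo) / (steps - 1)
    for i in range(steps):
        for j in range(steps):
            x1, x2 = lo + i * h, lo + j * h
            v = obj(x1, x2)
            if best is None or v < best[2]:
                best = (x1, x2, v)
    return best

# Maximize f by minimizing its negative, as in the text.
x1, x2, neg_j = grid_search(lambda a, b: -f(a, b), -5.0, 5.0)
print(x1, x2, -neg_j)  # near (1, 2) with J near 5
```

The same search applied directly to f with the comparison reversed would find the maximum; negating the OF is simply the conventional way to pose every problem as a minimization.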
Figure 7.1 illustrates Function 11 in the 2‐D Optimization Examples file, an exploration of the perpendicular (or normal, ...