Chapter 1 Computational Optimization

This chapter provides a theoretical introduction to computational optimization, covering minimization and maximization, constrained and unconstrained optimization, and convex and non-convex optimization. Several important computational optimization techniques are then discussed, including Newton-type approaches such as the Gauss-Newton and quasi-Newton methods, gradient-based methods such as steepest descent and conjugate gradient, and non-gradient (derivative-free) methods such as genetic algorithms and swarm intelligence algorithms. In addition, several optimizers widely used in machine learning are presented, including the Levenberg–Marquardt algorithm, Scaled Conjugate Gradient, RMSProp, and Stochastic Gradient Descent. Furthermore, several important and relatively ...
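
As a minimal illustration of one of the gradient-based methods named above, the following sketch applies steepest descent to a simple convex quadratic. The objective, step size, and stopping tolerance are illustrative assumptions for this example, not taken from the chapter.

    import numpy as np

    def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
        """Iterate x_{k+1} = x_k - step * grad(x_k) until the gradient is small."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            x = x - step * g
        return x

    # Example: minimize f(x) = 0.5 * x^T A x - b^T x, whose minimizer solves A x = b.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x_min = steepest_descent(lambda x: A @ x - b, x0=np.zeros(2))
    print(x_min)  # close to np.linalg.solve(A, b)

For this quadratic, steepest descent converges for any fixed step size smaller than 2 divided by the largest eigenvalue of A; the later chapters' methods (conjugate gradient, quasi-Newton, and so on) differ mainly in how the search direction and step size are chosen.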
