Optimal Control for Chemical Engineers

Book description

This self-contained book gives a detailed treatment of optimal control theory that enables readers to formulate and solve optimal control problems. With a strong emphasis on problem solving, it provides all the necessary mathematical analyses and derivations of important results, including multiplier theorems and Pontryagin's minimum principle. The text introduces the basic concepts of optimal control through a variety of examples and describes important numerical methods and computational algorithms for solving a wide range of optimal control problems, including periodic processes.
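To give a flavor of the numerical methods the book covers (cf. Chapter 7, "Gradient Method"), here is a minimal sketch, not the book's own code, that applies steepest descent to a toy fixed-final-time problem: minimize J = ∫₀¹ (x² + u²) dt subject to dx/dt = u, x(0) = 1, with free final state. The grid size, step length, and all names are illustrative assumptions.

```python
# Gradient-method sketch for: min J = ∫₀¹ (x² + u²) dt,  dx/dt = u,  x(0) = 1.
# Hamiltonian H = x² + u² + λu, so the costate obeys dλ/dt = -∂H/∂x = -2x with
# λ(1) = 0, and the descent direction is -∂H/∂u = -(2u + λ).
import numpy as np

N = 1000               # number of time intervals (an arbitrary choice)
dt = 1.0 / N

def simulate(u):
    """Integrate the state equation forward with explicit Euler."""
    x = np.empty(N + 1)
    x[0] = 1.0
    for k in range(N):
        x[k + 1] = x[k] + dt * u[k]
    return x

def costate(x):
    """Integrate the costate equation backward from λ(1) = 0."""
    lam = np.empty(N + 1)
    lam[-1] = 0.0
    for k in range(N, 0, -1):
        lam[k - 1] = lam[k] + dt * 2.0 * x[k]
    return lam

def objective(x, u):
    """Rectangle-rule approximation of the integral objective."""
    return dt * np.sum((x**2 + u**2)[:-1])

u = np.zeros(N + 1)    # initial control guess
for _ in range(200):   # steepest-descent iterations, fixed step length 0.5
    x = simulate(u)
    lam = costate(x)
    u -= 0.5 * (2.0 * u + lam)

x = simulate(u)
print(f"J = {objective(x, u):.4f}")  # approaches tanh(1) ≈ 0.762 for this problem
```

The analytic optimum for this linear-quadratic example is J* = tanh(1)·x(0)², which makes the sketch easy to check against the converged value.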

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Dedication
  6. Table of Contents
  7. Preface
  8. Notation
  9. 1 Introduction
    1. 1.1 Definition
    2. 1.2 Optimal Control versus Optimization
    3. 1.3 Examples of Optimal Control Problems
      1. 1.3.1 Batch Distillation
      2. 1.3.2 Plug Flow Reactor
      3. 1.3.3 Heat Exchanger
      4. 1.3.4 Gas Diffusion in a Non-Volatile Liquid
      5. 1.3.5 Periodic Reactor
      6. 1.3.6 Nuclear Reactor
      7. 1.3.7 Vapor Extraction of Heavy Oil
      8. 1.3.8 Chemotherapy
      9. 1.3.9 Medicinal Drug Delivery
      10. 1.3.10 Blood Flow and Metabolism
    4. 1.4 Structure of Optimal Control Problems
    5. Bibliography
    6. Exercises
  10. 2 Fundamental Concepts
    1. 2.1 From Function to Functional
      1. 2.1.1 Functional as a Multivariable Function
    2. 2.2 Domain of a Functional
      1. 2.2.1 Linear or Vector Spaces
      2. 2.2.2 Norm of a Function
    3. 2.3 Properties of Functionals
    4. 2.4 Differential of a Functional
      1. 2.4.1 Fréchet Differential
      2. 2.4.2 Gâteaux Differential
      3. 2.4.3 Variation
      4. 2.4.4 Summary of Differentials
      5. 2.4.5 Relations between Differentials
    5. 2.5 Variation of an Integral Objective Functional
      1. 2.5.1 Equivalence to Other Differentials
      2. 2.5.2 Application to Optimal Control Problems
    6. 2.6 Second Variation
      1. 2.6.1 Second Degree Homogeneity
      2. 2.6.2 Contribution to Functional Change
    7. 2.A Second-Order Taylor Expansion
    8. Bibliography
    9. Exercises
  11. 3 Optimality in Optimal Control Problems
    1. 3.1 Necessary Condition for Optimality
    2. 3.2 Application to Simplest Optimal Control Problem
      1. 3.2.1 Augmented Functional
      2. 3.2.2 Optimal Control Analysis
      3. 3.2.3 Generalization
    3. 3.3 Solving an Optimal Control Problem
      1. 3.3.1 Presence of Several Local Optima
    4. 3.4 Sufficient Conditions
      1. 3.4.1 Weak Sufficient Condition
    5. 3.5 Piecewise Continuous Controls
    6. 3.A Differentiability of λ
    7. 3.B Vanishing of (Fy + λGy + λ̇) at t = 0
    8. 3.C Mangasarian Sufficiency Condition
    9. Bibliography
    10. Exercises
  12. 4 Lagrange Multipliers
    1. 4.1 Motivation
    2. 4.2 Role of Lagrange Multipliers
    3. 4.3 Lagrange Multiplier Theorem
      1. 4.3.1 Generalization to Several Equality Constraints
      2. 4.3.2 Generalization to Several Functions
      3. 4.3.3 Application to Optimal Control Problems
    4. 4.4 Lagrange Multiplier and Objective Functional
      1. 4.4.1 General Relation
    5. 4.5 John Multiplier Theorem for Inequality Constraints
      1. 4.5.1 Generalized John Multiplier Theorem
      2. 4.5.2 Remark on Numerical Solutions
    6. 4.A Inverse Function Theorem
    7. Bibliography
    8. Exercises
  13. 5 Pontryagin’s Minimum Principle
    1. 5.1 Application
    2. 5.2 Problem Statement
      1. 5.2.1 Class of Controls
      2. 5.2.2 New State Variable
      3. 5.2.3 Notation
    3. 5.3 Pontryagin’s Minimum Principle
      1. 5.3.1 Assumptions
      2. 5.3.2 Statement
    4. 5.4 Derivation of Pontryagin’s Minimum Principle
      1. 5.4.1 Pulse Perturbation of Optimal Control
      2. 5.4.2 Temporal Perturbation of Optimal Control
      3. 5.4.3 Effect on Final State
      4. 5.4.4 Choice of Final Costate
      5. 5.4.5 Minimality of the Hamiltonian
      6. 5.4.6 Zero Hamiltonian at Free Final Time
    5. 5.A Convexity of Final States
    6. 5.B Supporting Hyperplane of a Convex Set
    7. Bibliography
  14. 6 Different Types of Optimal Control Problems
    1. 6.1 Free Final Time
      1. 6.1.1 Free Final State
      2. 6.1.2 Fixed Final State
      3. 6.1.3 Final State on Hypersurfaces
    2. 6.2 Fixed Final Time
      1. 6.2.1 Free Final State
      2. 6.2.2 Fixed Final State
      3. 6.2.3 Final State on Hypersurfaces
    3. 6.3 Algebraic Constraints
      1. 6.3.1 Algebraic Equality Constraints
      2. 6.3.2 Algebraic Inequality Constraints
    4. 6.4 Integral Constraints
      1. 6.4.1 Integral Equality Constraints
      2. 6.4.2 Integral Inequality Constraints
    5. 6.5 Interior Point Constraints
    6. 6.6 Discontinuous Controls
    7. 6.7 Multiple Integral Problems
    8. Bibliography
    9. Exercises
  15. 7 Numerical Solution of Optimal Control Problems
    1. 7.1 Gradient Method
      1. 7.1.1 Free Final Time and Free Final State
      2. 7.1.2 Iterative Procedure
      3. 7.1.3 Improvement Strategy
      4. 7.1.4 Algorithm for the Gradient Method
      5. 7.1.5 Fixed Final Time and Free Final State
    2. 7.2 Penalty Function Method
      1. 7.2.1 Free Final Time and Final State on Hypersurfaces
      2. 7.2.2 Free Final Time but Fixed Final State
      3. 7.2.3 Algebraic Equality Constraints
      4. 7.2.4 Integral Equality Constraints
      5. 7.2.5 Algebraic Inequality Constraints
      6. 7.2.6 Integral Inequality Constraints
    3. 7.3 Shooting Newton–Raphson Method
    4. 7.A Derivation of Steepest Descent Direction
      1. 7.A.1 Objective
      2. 7.A.2 Sufficiency Check
    5. Bibliography
    6. Exercises
  16. 8 Optimal Periodic Control
    1. 8.1 Optimality of Periodic Controls
      1. 8.1.1 Necessary Conditions
    2. 8.2 Solution Methods
      1. 8.2.1 Successive Substitution Method
      2. 8.2.2 Shooting Newton–Raphson Method
    3. 8.3 Pi Criterion
    4. 8.4 Pi Criterion with Control Constraints
    5. 8.A Necessary Conditions for Optimal Steady State
    6. 8.B Derivation of Equation (8.12)
    7. 8.C Fourier Transform
    8. 8.D Derivation of Equation (8.25)
    9. Bibliography
    10. Exercises
  17. 9 Mathematical Review
    1. 9.1 Limit of a Function
    2. 9.2 Continuity of a Function
      1. 9.2.1 Lower and Upper Semi-Continuity
    3. 9.3 Intervals and Neighborhoods
    4. 9.4 Bounds
    5. 9.5 Order of Magnitude
      1. 9.5.1 Big-O Notation
    6. 9.6 Taylor Series and Remainder
    7. 9.7 Autonomous Differential Equations
      1. 9.7.1 Non-Autonomous to Autonomous Transformation
    8. 9.8 Differential
    9. 9.9 Derivative
      1. 9.9.1 Directional Derivative
    10. 9.10 Leibniz Integral Rule
    11. 9.11 Newton–Raphson Method
    12. 9.12 Composite Simpson’s 1/3 Rule
    13. 9.13 Fundamental Theorem of Calculus
    14. 9.14 Mean Value Theorem
      1. 9.14.1 For Derivatives
      2. 9.14.2 For Integrals
    15. 9.15 Intermediate Value Theorem
    16. 9.16 Implicit Function Theorem
    17. 9.17 Bolzano–Weierstrass Theorem
    18. 9.18 Weierstrass Theorem
    19. 9.19 Linear or Vector Space
    20. 9.20 Direction of a Vector
    21. 9.21 Parallelogram Identity
    22. 9.22 Triangle Inequality for Integrals
    23. 9.23 Cauchy–Schwarz Inequality
    24. 9.24 Operator Inequality
    25. 9.25 Conditional Statement
    26. 9.26 Fundamental Matrix
    27. Bibliography
  18. Index

Product information

  • Title: Optimal Control for Chemical Engineers
  • Author(s): Simant Ranjan Upreti
  • Release date: April 2016
  • Publisher(s): CRC Press
  • ISBN: 9781000218718