Nature-Inspired Optimization Algorithms

Book description

Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning and control, as well as multi-objective optimization.

This book can serve as an introductory text for graduates, doctoral students and lecturers in computer science, engineering and natural sciences. It can also serve as a source of inspiration for new applications. Researchers and engineers as well as experienced experts will also find it a handy reference.

  • Discusses and summarizes the latest developments in nature-inspired algorithms with comprehensive, timely literature
  • Provides a theoretical understanding as well as practical implementation hints
  • Provides a step-by-step introduction to each algorithm

Table of contents

  1. Cover image
  2. Title page
  3. Table of Contents
  4. Copyright
  5. Preface
  6. 1: Introduction to Algorithms
    1. 1.1 What is an Algorithm?
    2. 1.2 Newton’s Method
    3. 1.3 Optimization
    4. 1.4 Search for Optimality
    5. 1.5 No-Free-Lunch Theorems
    6. 1.6 Nature-Inspired Metaheuristics
    7. 1.7 A Brief History of Metaheuristics
  7. 2: Analysis of Algorithms
    1. 2.1 Introduction
    2. 2.2 Analysis of Optimization Algorithms
    3. 2.3 Nature-Inspired Algorithms
    4. 2.4 Parameter Tuning and Parameter Control
    5. 2.5 Discussions
    6. 2.6 Summary
  8. 3: Random Walks and Optimization
    1. 3.1 Random Variables
    2. 3.2 Isotropic Random Walks
    3. 3.3 Lévy Distribution and Lévy Flights
    4. 3.4 Optimization as Markov Chains
    5. 3.5 Step Sizes and Search Efficiency
    6. 3.6 Modality and Intermittent Search Strategy
    7. 3.7 Importance of Randomization
    8. 3.8 Eagle Strategy
  9. 4: Simulated Annealing
    1. 4.1 Annealing and Boltzmann Distribution
    2. 4.2 Parameters
    3. 4.3 SA Algorithm
    4. 4.4 Unconstrained Optimization
    5. 4.5 Basic Convergence Properties
    6. 4.6 SA Behavior in Practice
    7. 4.7 Stochastic Tunneling
  10. 5: Genetic Algorithms
    1. 5.1 Introduction
    2. 5.2 Genetic Algorithms
    3. 5.3 Role of Genetic Operators
    4. 5.4 Choice of Parameters
    5. 5.5 GA Variants
    6. 5.6 Schema Theorem
    7. 5.7 Convergence Analysis
  11. 6: Differential Evolution
    1. 6.1 Introduction
    2. 6.2 Differential Evolution
    3. 6.3 Variants
    4. 6.4 Choice of Parameters
    5. 6.5 Convergence Analysis
    6. 6.6 Implementation
  12. 7: Particle Swarm Optimization
    1. 7.1 Swarm Intelligence
    2. 7.2 PSO Algorithm
    3. 7.3 Accelerated PSO
    4. 7.4 Implementation
    5. 7.5 Convergence Analysis
    6. 7.6 Binary PSO
  13. 8: Firefly Algorithms
    1. 8.1 The Firefly Algorithm
    2. 8.2 Algorithm Analysis
    3. 8.3 Implementation
    4. 8.4 Variants of the Firefly Algorithm
    5. 8.5 Firefly Algorithms in Applications
    6. 8.6 Why the Firefly Algorithm is Efficient
  14. 9: Cuckoo Search
    1. 9.1 Cuckoo Breeding Behavior
    2. 9.2 Lévy Flights
    3. 9.3 Cuckoo Search
    4. 9.4 Why Cuckoo Search is so Efficient
    5. 9.5 Global Convergence: Brief Mathematical Analysis
    6. 9.6 Applications
  15. 10: Bat Algorithms
    1. 10.1 Echolocation of Bats
    2. 10.2 Bat Algorithms
    3. 10.3 Implementation
    4. 10.4 Binary Bat Algorithms
    5. 10.5 Variants of the Bat Algorithm
    6. 10.6 Convergence Analysis
    7. 10.7 Why the Bat Algorithm is Efficient
    8. 10.8 Applications
  16. 11: Flower Pollination Algorithms
    1. 11.1 Introduction
    2. 11.2 Flower Pollination Algorithm
    3. 11.3 Multi-Objective Flower Pollination Algorithms
    4. 11.4 Validation and Numerical Experiments
    5. 11.5 Applications
    6. 11.6 Further Research Topics
  17. 12: A Framework for Self-Tuning Algorithms
    1. 12.1 Introduction
    2. 12.2 Algorithm Analysis and Parameter Tuning
    3. 12.3 Framework for Self-Tuning Algorithms
    4. 12.4 A Self-Tuning Firefly Algorithm
    5. 12.5 Some Remarks
  18. 13: How to Deal with Constraints
    1. 13.1 Introduction and Overview
    2. 13.2 Method of Lagrange Multipliers
    3. 13.3 KKT Conditions
    4. 13.4 Penalty Method
    5. 13.5 Equality with Tolerance
    6. 13.6 Feasibility Rules and Stochastic Ranking
    7. 13.7 Multi-objective Approach to Constraints
    8. 13.8 Spring Design
    9. 13.9 Cuckoo Search Implementation
  19. 14: Multi-Objective Optimization
    1. 14.1 Multi-Objective Optimization
    2. 14.2 Pareto Optimality
    3. 14.3 Weighted Sum Method
    4. 14.4 Utility Method
    5. 14.5 The ε-Constraint Method
    6. 14.6 Metaheuristic Approaches
    7. 14.7 NSGA-II
  20. 15: Other Algorithms and Hybrid Algorithms
    1. 15.1 Ant Algorithms
    2. 15.2 Bee-Inspired Algorithms
    3. 15.3 Harmony Search
    4. 15.4 Hybrid Algorithms
    5. 15.5 Final Remarks
  21. Appendix A: Test Function Benchmarks for Global Optimization
  22. Appendix B: Matlab Programs
    1. B.1 Simulated Annealing
    2. B.2 Particle Swarm Optimization
    3. B.3 Differential Evolution
    4. B.4 Firefly Algorithm
    5. B.5 Cuckoo Search
    6. B.6 Bat Algorithm
    7. B.7 Flower Pollination Algorithm

Product information

  • Title: Nature-Inspired Optimization Algorithms
  • Author(s): Xin-She Yang
  • Release date: February 2014
  • Publisher(s): Elsevier
  • ISBN: 9780124167452