Live Online Training

Practical Evolutionary Computing


Nature-Inspired Algorithms for Optimization Problems

Julian Zucker

Evolutionary computing is a natural framework for multi-objective optimization problems, where the quality of a solution is evaluated on several metrics at once. It uses the basic principles of random variation and survival of the fittest to gradually evolve a population of solutions to a problem. For example, on a factory floor, evolutionary computing has been used to optimize machine scheduling, inventory management, and transportation logistics, minimizing costs while maximizing on-time delivery and customer satisfaction. Or, imagine a couple planning a wedding and needing to seat their guests at various tables. Their goal is to minimize constraint violations (avoid seating Uncle Joe with Aunt Mary), minimize age variance (don’t seat your friend from college at the kids’ table), and perhaps make sure that certain people sit together (your book club should all sit at the same table). These two problems, despite their differences, can be tackled with the same general problem-solving framework.

This course will teach learners how to identify optimization problems and solve them with evolutionary computing. We will work on two different problems: the N-Queens Problem and a variant of the Traveling Salesperson Problem with multiple objectives. Learners will get a chance to work hands-on and implement their own generic evolutionary computing system. We will explore some of the possible extensions to this system, including parallelizing the system and building a meta-system that optimizes how our first system works. Learners will leave this course with a clearer understanding of optimization problems and a new tool to apply in optimization situations.

What you'll learn and how you can apply it

By the end of this live, hands-on, online course, you’ll understand:

  • The definition of an optimization problem, and how to identify one in the real world.
  • The design of evolutionary computing systems, and how to improve their performance.

And you’ll be able to:

  • Use evolutionary computing to solve optimization problems.
  • Use evolutionary computing to approximate solutions to hard (NP-complete) problems.

This training course is for you because...

  • You’re a software engineer who wants to be able to solve problems that don’t have known algorithms, or where the known algorithms are too slow.
  • You are often faced with problems with more potential solutions than you have time to try, and you want an algorithm to produce a reasonably good solution for you.


Prerequisites:

  • Basic or intermediate familiarity with Python
  • Comfort with high-school level algebra

Recommended preparation:

  • Examples and exercises will be provided as notebooks in JupyterHub during class

About your instructor

  • Julian Zucker is a software engineer at Pivotal, where he works on an open-source protocol for monitoring cloud computing workloads. Previously, he worked at PayPal as a business intelligence engineer, creating large-scale data processing systems for analytics. He is a core developer on Evvo, an open-source distributed evolutionary computing framework. He has published work on simulations of the evolution of inter-group social interactions, parallelizing the simulations to run a thousand times faster and enabling the exploration of ideas that could not previously be simulated.


The timeframes are only estimates and may vary according to how the class is progressing.

Optimization Problems (10 minutes)

  • Presentation: Euclidean optimization
    • What is an optimization problem?
    • Optimizing one-dimensional Euclidean functions (like f(x) = …)
    • Optimization methods: grid search, random search, gradient descent
    • Optimizing two-dimensional Euclidean functions (like f(x,y) = …)
    • How those optimization methods look in multiple dimensions
  • Discussion
    • Why are some functions harder to optimize than others?
    • Is it possible to find the optimum of every function?
    • How good a solution is grid search or gradient descent guaranteed to find?
    • Can you think of algorithms other than the ones we proposed to solve these problems?
  • Presentation: Non-Euclidean optimization
    • Optimizing vectors of floats
    • Example – evolved antennas
    • Example – N-queens solutions
  • Q&A
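
As a preview of the first two optimization methods in this session, here is a minimal sketch of grid search and random search on a one-dimensional function (the function and all names are illustrative, not the course's provided notebooks):

```python
import random

def f(x):
    # A toy one-dimensional function whose minimum is at x = 2, f(2) = 1.
    return (x - 2) ** 2 + 1

def grid_search(f, lo, hi, steps):
    # Evaluate f at evenly spaced candidate points and keep the best one.
    xs = [lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
    return min(xs, key=f)

def random_search(f, lo, hi, samples, seed=0):
    # Evaluate f at uniformly random candidate points and keep the best one.
    rng = random.Random(seed)
    return min((rng.uniform(lo, hi) for _ in range(samples)), key=f)

best = grid_search(f, -10, 10, 201)
print(best, f(best))  # x = 2.0 lies on this grid, so grid search finds the exact optimum
```

Both methods only ever evaluate f at chosen points, so they apply even when no gradient is available; gradient descent trades that generality for speed on smooth functions.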

Introduction to Evolutionary Computing (25 minutes)

  • How does natural selection work?
  • What is evolutionary computing?
  • Requirements: fitness function, initial population, mutator
  • When do you stop?
  • Main loop pseudocode
  • Different representations for the N-queens problem
  • Discussion
    • What are the upsides and downsides of evolutionary computing compared to the previous optimization approaches?
    • What are the effects of population size on optimization speed?
    • Is it better to have “smarter” mutators? What if they take longer to run?
  • Exercise
    • Implement the main loop! Use your fitness functions from before, and a provided simple operator that swaps two points.
    • Find a solution to the 8-queens problem, using the code you just wrote.
  • Q&A
  • Break (5 minutes)
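
The main loop from the exercise might be sketched as follows (illustrative names and a toy bit-string problem, not the course's scaffolding; higher fitness is assumed to be better):

```python
import random

rng = random.Random(42)

def evolve(initial_population, fitness, mutate, generations=200, population_size=50):
    # Minimal evolutionary main loop: vary, evaluate, keep the fittest.
    population = list(initial_population)
    for _ in range(generations):
        # Random variation: every current solution produces one mutated child.
        children = [mutate(p) for p in population]
        # Survival of the fittest: keep the best half of parents plus children.
        population = sorted(population + children, key=fitness, reverse=True)[:population_size]
    return max(population, key=fitness)

# Toy problem: evolve a 20-bit string toward all ones.
def fitness(bits):
    return sum(bits)

def mutate(bits):
    child = list(bits)
    i = rng.randrange(len(child))
    child[i] ^= 1  # flip one random bit
    return child

start = [[rng.randint(0, 1) for _ in range(20)] for _ in range(50)]
best = evolve(start, fitness, mutate)
print(fitness(best))
```

Because the sort keeps the best solutions every generation, the best fitness seen never decreases; the stopping rule here is a fixed generation count, one of several options discussed above.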

Solving the N-Queens Problem (20 minutes)

  • Three different representations for the N-queens problem
  • How to benchmark? Speed to perfect solution works here, but not always
  • Explain the API of the provided benchmarking function
  • Exercise: Benchmark your solution, using the provided library functions
  • Presentation: Performance comparisons
  • Q&A
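
For the permutation representation of N-queens, one natural fitness measure counts attacking pairs (zero means solved). A sketch, not the course's provided benchmarking library:

```python
from itertools import combinations

def queen_conflicts(board):
    # board[i] is the row of the queen in column i. Because the board is a
    # permutation, no two queens share a row or column, so only diagonal
    # attacks are possible: |column difference| == |row difference|.
    return sum(
        1
        for (c1, r1), (c2, r2) in combinations(enumerate(board), 2)
        if abs(c1 - c2) == abs(r1 - r2)
    )

# A known 8-queens solution has zero conflicts:
print(queen_conflicts([0, 4, 7, 5, 2, 6, 1, 3]))  # 0
```

An evolutionary system would minimize this count (or maximize its negation), and "speed to a perfect solution" then means time until the count first reaches zero.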

Multi-Objective Optimization (30 minutes)

  • When is an optimization problem multi-objective?
  • How do you define the objectives?
  • Why can’t you just add the objectives together, or otherwise combine them?
  • How do you delete solutions? You can no longer just keep the “best” one
  • Example: false negative rate, false positive rate, and fairness in ML
  • Discussion
    • Discuss some examples of multi-objective problems in the real world
    • Is it ever reasonable to combine these different objectives?
    • When might you want to convert a single objective into multiple objectives?
  • Exercise: Fill in the missing elements of the multi-objective optimizer skeleton code
  • Q&A
  • Break (5 minutes)
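
To illustrate why you can no longer keep a single "best" solution, here is a minimal Pareto-front filter (illustrative names, assuming every objective is minimized; the course's skeleton code may differ):

```python
def dominates(a, b):
    # a dominates b if a is at least as good on every objective
    # and strictly better on at least one (all objectives minimized).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scores):
    # Instead of deleting all but the single best solution, a multi-objective
    # optimizer deletes only solutions that some other solution dominates.
    return [s for s in scores if not any(dominates(t, s) for t in scores)]

# Two objectives, e.g. (cost, delivery time), both minimized:
scores = [(1, 9), (3, 3), (9, 1), (5, 5), (2, 8)]
print(pareto_front(scores))  # (5, 5) is dominated by (3, 3) and dropped
```

Note that (1, 9) and (9, 1) both survive: neither beats the other on every objective, which is exactly why summing the objectives would hide real trade-offs.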

Bi-Objective Traveling Salesperson Problem (50 Minutes)

  • The regular, one-objective traveling salesperson problem
    • Famously NP-hard
    • But good algorithms exist – not a good place to use evolutionary computing
  • The bi-objective traveling salesperson problem
    • Introduction of the second objective
    • Similar to the time/cost tradeoff in deliveries
  • Solution representation (a tour)
  • How to create the initial population
    • Random, greedy, and hybrid approaches
  • Mutation operators
  • Crossover operators
  • Examples of problems that can be converted to the bi-objective traveling salesperson problem, and their applications
  • Introduction to the provided scaffolding – the fitness functions are pre-defined, as is a “safety checker” that ensures a tour is valid
  • Exercise
    • Implement a mutation operator that swaps two random cities in the tour
    • Implement a one-point crossover operator
    • Apply both operators in the main loop to evolve TSP solutions
  • Q&A
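
The two exercise operators might look like this sketch (assuming a tour is a Python list of city indices; all names are illustrative). Note that naively splicing two tours at one point would duplicate some cities, so the crossover below fills the remainder from the second parent instead:

```python
import random

rng = random.Random(7)

def swap_mutation(tour):
    # Swap two random cities; the result is still a valid permutation.
    i, j = rng.sample(range(len(tour)), 2)
    child = list(tour)
    child[i], child[j] = child[j], child[i]
    return child

def one_point_crossover(parent_a, parent_b):
    # Keep a prefix of parent A, then append the missing cities in the
    # order they appear in parent B, so every city occurs exactly once.
    cut = rng.randrange(1, len(parent_a))
    prefix = parent_a[:cut]
    seen = set(prefix)
    return prefix + [c for c in parent_b if c not in seen]

tour = list(range(10))
child = one_point_crossover(swap_mutation(tour), list(reversed(tour)))
print(sorted(child) == tour)  # True: the child still visits every city exactly once
```

A "safety checker" like the one the scaffolding provides would amount to exactly that final assertion: the child is a permutation of the cities.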

Extensions (15 Minutes)

  • Parallelism
  • Constraints on solutions
  • Dynamically creating new solutions, instead of relying only on the initial population
  • Varying population size to maximize diversity when stagnating
  • Logging/tracing to identify good mutation operators
  • Evolving which operators are used (“meta-evolutionary systems”)
  • Other problems
  • Q&A
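
As a sketch of the parallelism extension: fitness evaluation is usually the bottleneck and is embarrassingly parallel, so the whole population can be scored concurrently with Python's standard concurrent.futures. The names and toy fitness function below are illustrative; for genuinely CPU-bound fitness functions in CPython, ProcessPoolExecutor (same API) is the usual choice over the threads shown here:

```python
from concurrent.futures import ThreadPoolExecutor

def fitness(x):
    # Stand-in fitness function; imagine something expensive here.
    return -(x - 3) ** 2

population = list(range(10))

# Score every member of the population concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(fitness, population))

best = max(zip(population, scores), key=lambda pair: pair[1])[0]
print(best)  # 3
```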

Final Q&A (5 minutes)