Generative Deep Learning, 2nd Edition

by David Foster
April 2023
Intermediate to advanced
456 pages
11h 12m
English
O'Reilly Media, Inc.
Content preview from Generative Deep Learning, 2nd Edition

Chapter 7. Energy-Based Models

Energy-based models are a broad class of generative models that borrow a key idea from the modeling of physical systems: the probability of an event can be expressed using a Boltzmann distribution, which converts a real-valued energy function into a probability between 0 and 1 by exponentiating its negative and normalizing. The distribution was originally formulated in 1868 by Ludwig Boltzmann, who used it to describe gases in thermal equilibrium.
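As a rough illustration (this is not code from the chapter), the Boltzmann distribution over a finite set of states can be sketched in a few lines of NumPy. The function name `boltzmann` and the example energies are illustrative choices, not part of the book's codebase:

```python
import numpy as np

def boltzmann(energies, temperature=1.0):
    # Boltzmann distribution: p_i = exp(-E_i / T) / Z,
    # where Z = sum_j exp(-E_j / T) normalizes the probabilities.
    logits = -np.asarray(energies, dtype=float) / temperature
    logits -= logits.max()        # subtract max for numerical stability
    p = np.exp(logits)
    return p / p.sum()

# Lower energy states receive higher probability.
p = boltzmann([0.0, 1.0, 2.0])
```

Note that the exponential guarantees every probability is positive, and the normalizing constant Z (the partition function) makes them sum to 1; for realistic models Z is intractable to compute exactly, which is the central difficulty the chapter's training techniques address.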

In this chapter, we will see how we can use this idea to train a generative model that produces images of handwritten digits. We will explore several new concepts, including contrastive divergence for training the energy-based model (EBM) and Langevin dynamics for sampling from it.

Introduction

We will begin with a short story to illustrate the key concepts behind energy-based models.



ISBN: 9781098134174