Chapter 16

Probabilistic Graphical Models: Part II

Abstract

This is the second chapter dealing with probabilistic graphical models. Junction trees are first reviewed, and a message-passing algorithm for such structures is developed. The focus then turns to approximate inference techniques on graphical models based on variational methods, covering both local and global approximations. Dynamic graphical models are discussed with an emphasis on hidden Markov models (HMMs). Inference and training of HMMs are viewed as special cases of the EM algorithm and the message-passing rationale. The Baum-Welch and Viterbi algorithms are derived. Finally, some extensions, including factorial HMMs and time-varying dynamic Bayesian networks, are presented. A discussion ...
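As a brief preview of the Viterbi algorithm mentioned above, the following is a minimal, self-contained sketch of Viterbi decoding for a discrete HMM. It is illustrative only and does not follow the chapter's derivation or notation; the parameter names (init, trans, emis) and the toy model values are assumptions made for the example.

```python
import numpy as np

def viterbi(obs, init, trans, emis):
    """Minimal Viterbi decoder for a discrete HMM (illustrative sketch).

    obs   : sequence of observation indices, length T
    init  : (K,)   initial state probabilities
    trans : (K, K) transition probabilities, trans[i, j] = P(z_t = j | z_{t-1} = i)
    emis  : (K, M) emission probabilities, emis[k, v] = P(x_t = v | z_t = k)
    Returns the most probable hidden-state path as a list of state indices.
    """
    T, K = len(obs), len(init)
    # Work in log space to avoid numerical underflow on long sequences.
    log_delta = np.log(init) + np.log(emis[:, obs[0]])
    backptr = np.zeros((T, K), dtype=int)

    for t in range(1, T):
        # scores[i, j] = best log-probability of being in state i at t-1 and moving to j.
        scores = log_delta[:, None] + np.log(trans)
        backptr[t] = np.argmax(scores, axis=0)
        log_delta = scores[backptr[t], np.arange(K)] + np.log(emis[:, obs[t]])

    # Backtrack from the best final state.
    path = [int(np.argmax(log_delta))]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# Toy example with two hidden states and three observation symbols (assumed values).
init = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
emis = np.array([[0.5, 0.4, 0.1],
                 [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2, 2], init, trans, emis))  # prints [0, 0, 1, 1]
```

The dynamic-programming recursion here is the same maximization-based message passing the chapter builds on; Baum-Welch replaces the max with a sum (forward-backward) inside an EM loop.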
