Chapter 15

Probabilistic Graphical Models

Part I


In many everyday machine learning applications involving multivariate statistical modeling, even simple inference tasks can easily become computationally intractable. Graph theory has proved a powerful and elegant tool that has been used extensively in optimization and computational theory. This chapter is the first of two chapters dedicated to probabilistic graphical models. Bayesian networks are introduced and the Markov condition is discussed. The d-separation property of Bayesian networks is reviewed. Markov random fields and conditional random fields are presented. Factor graphs are defined, and emphasis is placed on developing message-passing algorithms for chains and trees.
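As a preview of the message-passing idea mentioned above, the sketch below runs sum-product message passing on a small chain of three binary variables. The potentials, variable names, and chain length are illustrative assumptions, not taken from the chapter; the point is only the forward/backward recursion that makes exact marginal inference on a chain tractable.

```python
import numpy as np

# Chain x1 - x2 - x3 of binary variables (illustrative example, not from
# the chapter). phi[i] are unary potentials; psi[i] couples x_{i+1}, x_{i+2}.
phi = [np.array([0.7, 0.3]),
       np.array([0.5, 0.5]),
       np.array([0.2, 0.8])]
psi = [np.array([[0.9, 0.1],
                 [0.1, 0.9]]),
       np.array([[0.8, 0.2],
                 [0.2, 0.8]])]

n = len(phi)

# Forward pass: alpha[i] is the message arriving at node i from the left.
alpha = [np.ones(2) for _ in range(n)]
for i in range(1, n):
    alpha[i] = psi[i - 1].T @ (phi[i - 1] * alpha[i - 1])

# Backward pass: beta[i] is the message arriving at node i from the right.
beta = [np.ones(2) for _ in range(n)]
for i in range(n - 2, -1, -1):
    beta[i] = psi[i] @ (phi[i + 1] * beta[i + 1])

# Marginal at each node: local potential times both incoming messages,
# then normalize.
marginals = []
for i in range(n):
    p = phi[i] * alpha[i] * beta[i]
    marginals.append(p / p.sum())
    print(f"p(x{i + 1}) =", marginals[i])
```

The cost is linear in the chain length, in contrast to the exponential cost of summing the joint distribution directly; the same two-sweep scheme generalizes to trees, which is the topic developed later in the chapter.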
