In many everyday machine learning applications involving multivariate statistical modeling, even simple inference tasks can easily become computationally intractable. Graph theory has proved to be a powerful and elegant tool that has been used extensively in optimization and computational theory. This chapter is the first of two chapters dedicated to probabilistic graphical models. Bayesian networks are introduced and the Markov condition is discussed. The d-separation property of Bayesian networks is reviewed. Markov random fields and conditional random fields are presented. Factor graphs are defined, and emphasis is given to developing message-passing algorithms for chains and trees.
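To give a concrete flavor of the message-passing algorithms developed for chains, the sketch below implements sum-product (forward-backward) inference on a chain of discrete variables. The function name, the toy unary and pairwise factors, and the specific formulation are illustrative assumptions, not code from the book; they simply instantiate the idea that exact marginals on a chain can be computed by one forward and one backward sweep of messages.

```python
import numpy as np

def chain_sum_product(unary, pairwise):
    """Sum-product message passing on a chain of discrete variables.

    unary    : list of N vectors, unary[i][k]       = phi_i(x_i = k)
    pairwise : list of N-1 matrices, pairwise[i][j, k] = psi(x_i = j, x_{i+1} = k)
    Returns the normalized marginal distribution of every variable.
    """
    n = len(unary)
    # Forward messages alpha_i(x_i): accumulate factors from the left.
    alpha = [np.asarray(unary[0], dtype=float)]
    for i in range(1, n):
        msg = alpha[-1] @ pairwise[i - 1]                # sum over x_{i-1}
        alpha.append(msg * unary[i])
    # Backward messages beta_i(x_i): accumulate factors from the right.
    beta = [np.ones(len(unary[i])) for i in range(n)]
    for i in range(n - 2, -1, -1):
        beta[i] = pairwise[i] @ (unary[i + 1] * beta[i + 1])  # sum over x_{i+1}
    # Each marginal is proportional to the product of its two incoming messages.
    marginals = []
    for a, b in zip(alpha, beta):
        p = a * b
        marginals.append(p / p.sum())
    return marginals

# Toy example (hypothetical numbers): three binary variables whose
# pairwise factors favor equal neighboring states.
unary = [np.array([0.7, 0.3]), np.array([0.5, 0.5]), np.array([0.2, 0.8])]
pairwise = [np.array([[0.9, 0.1], [0.1, 0.9]])] * 2
for i, m in enumerate(chain_sum_product(unary, pairwise)):
    print(f"p(x_{i}) = {m}")
```

On a chain this computes exact marginals in time linear in the number of variables, which is the computational saving that motivates the message-passing view developed in the chapter; the same scheme extends to trees by passing messages inward to a root and back out to the leaves.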