April 2016
Beginner to intermediate
250 pages
5h 38m
English
In this chapter, we used the simple yet powerful Bayesian model, which can be represented as a probabilistic graphical model. We saw a Bayesian treatment of the overfitting problem through the use of priors, such as the Dirichlet-multinomial and the well-known Beta-Binomial model.
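To make the Beta-Binomial idea concrete, here is a minimal sketch in Python. It assumes hypothetical data (a coin flipped 10 times) and illustrative prior pseudo-counts; the point is only that the Beta prior is conjugate to the Binomial likelihood, so the posterior is again a Beta whose parameters simply add the observed counts.

```python
# Beta-Binomial update (illustrative numbers, not from the chapter).
alpha, beta = 2.0, 2.0        # Beta prior pseudo-counts (prior belief)
heads, tails = 7, 3           # hypothetical observed data

# Conjugacy: posterior is Beta(alpha + heads, beta + tails)
alpha_post = alpha + heads
beta_post = beta + tails

# Posterior mean of the coin's bias
posterior_mean = alpha_post / (alpha_post + beta_post)
print(alpha_post, beta_post, posterior_mean)
```

Note how the prior acts as regularization: even after observing 7 heads out of 10, the posterior mean (9/14, about 0.64) is pulled toward the prior mean of 0.5, which is exactly the mechanism that guards against overfitting on small samples.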
The last section introduced another graphical model, the Gaussian mixture, which was around before the invention of probabilistic graphical models. It is a very important model for capturing data that comes from different subpopulations within the same dataset. And finally, we saw another application of the EM algorithm: learning such models by estimating the parameters of each Gaussian component.
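The EM fit described above can be sketched as follows. This is a minimal one-dimensional, two-component example with NumPy, using synthetic data and made-up initial values (none of it is taken from the chapter): the E-step computes each point's responsibility under each Gaussian, and the M-step re-estimates weights, means, and variances from those responsibilities.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data: two hypothetical subpopulations
x = np.concatenate([rng.normal(-2.0, 0.5, 200),
                    rng.normal(3.0, 1.0, 300)])

# Initial guesses for mixture weights, means, and variances
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibilities r[n, k] = p(component k | x_n)
    dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
           / np.sqrt(2 * np.pi * var)
    r = w * dens
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from responsibility-weighted data
    Nk = r.sum(axis=0)
    w = Nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / Nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print(np.round(mu, 2), np.round(w, 2))
```

After a few dozen iterations the estimated means settle near the true subpopulation centres (-2 and 3) and the weights near the true proportions (0.4 and 0.6), illustrating how EM recovers the parameters of each Gaussian component without ever observing the component labels.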
Of course, the Gaussian mixture is not the only latent variable model; ...