Variational Bayesian Inference for Mixture Models
Variational Bayes (VB) is an approximate method for performing Bayesian inference. Because it is not simulation-based, it offers a practical and highly time-efficient alternative to Markov chain Monte Carlo (MCMC) approaches. In the VB approach, an approximation to the true posterior is derived; this yields a set of coupled update expressions for the variational posterior estimates of the model parameters, which are solved iteratively. Variational Bayesian inference has been used in machine learning since the late 1990s (Attias 1999; Jordan et al. 1999), and there are now many examples of variational approaches in the machine learning literature, for example Bishop (2006) and Winn and Bishop (2005). However, it is only much more recently that VB has begun to gain popularity among statisticians as a tool for performing Bayesian analysis. Statistical articles have appeared describing the use of VB for mixture modelling (McGrory and Titterington 2007), hidden Markov chain modelling (McGrory and Titterington 2009), hidden Markov random field modelling for spatial data analysis (McGrory et al. 2009) and generalized linear mixed models (Ormerod and Wand 2011). Applications of VB include the analysis ...
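To illustrate the coupled iterative updates mentioned above, the following is a minimal coordinate-ascent VB sketch for a simple textbook setting: a K-component Gaussian mixture with unit component variances, equal weights, and a N(0, sigma0_sq) prior on each component mean. The function name, priors, and model simplifications are illustrative assumptions, not the specific model or notation of this article.

```python
import numpy as np

def cavi_gmm(x, K, sigma0_sq=10.0, n_iter=100, seed=0):
    """Coordinate-ascent VB for a K-component Gaussian mixture with unit
    component variances and a N(0, sigma0_sq) prior on each mean.
    Returns variational means m, variances s2, and responsibilities phi."""
    rng = np.random.default_rng(seed)
    n = len(x)
    m = rng.standard_normal(K)       # variational means of the component means
    s2 = np.ones(K)                  # variational variances of the component means
    phi = np.full((n, K), 1.0 / K)   # variational responsibilities q(z_i = k)
    for _ in range(n_iter):
        # Update q(z_i): responsibilities depend on the current q(mu_k).
        log_phi = np.outer(x, m) - 0.5 * (m**2 + s2)
        log_phi -= log_phi.max(axis=1, keepdims=True)  # numerical stability
        phi = np.exp(log_phi)
        phi /= phi.sum(axis=1, keepdims=True)
        # Update q(mu_k): a Gaussian whose parameters depend on the current
        # responsibilities -- this mutual dependence is the "coupling" that
        # makes the updates iterative.
        denom = 1.0 / sigma0_sq + phi.sum(axis=0)
        m = phi.T @ x / denom
        s2 = 1.0 / denom
    return m, s2, phi
```

Each pass alternates between the two coupled expressions until the variational posterior estimates stabilize; on well-separated data the recovered means typically settle close to the component means.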