Chapter 19. MCMC
For most of this book we’ve been using grid methods to approximate posterior distributions. For models with one or two parameters, grid algorithms are fast and the results are precise enough for most practical purposes. With three parameters, they start to be slow, and with more than three they are usually not practical.
In the previous chapter we saw that we can solve some problems using conjugate priors. But the problems we can solve this way tend to be the same ones we can solve with grid algorithms.
For problems with more than a few parameters, the most powerful tool we have is MCMC, which stands for “Markov chain Monte Carlo”. In this context, “Monte Carlo” refers to methods that generate random samples from a distribution. Unlike grid methods, MCMC methods don’t try to compute the posterior distribution; they sample from it instead.
It might seem strange that you can generate a sample without ever computing the distribution, but that’s the magic of MCMC.
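To give a sense of how this works, here is a minimal sketch of the Metropolis algorithm, the simplest MCMC method. It is illustrative only, not the sampler used later in this chapter; the function names and step size are choices made for this example. The key point is that the target density only needs to be known up to a multiplicative constant, because the acceptance rule depends on a ratio of densities.

```python
import numpy as np

def metropolis(log_density, start, n_samples, step=1.0, seed=1):
    """Sample from an unnormalized density using the Metropolis algorithm."""
    rng = np.random.default_rng(seed)
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.normal(scale=step)
        # Accept with probability min(1, p(proposal) / p(x));
        # the normalizing constant cancels in this ratio.
        if np.log(rng.uniform()) < log_density(proposal) - log_density(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Target: a standard normal, specified only up to a constant
samples = metropolis(lambda x: -x**2 / 2, start=0.0, n_samples=50_000)
print(np.mean(samples), np.std(samples))  # close to 0 and 1
```

Even though we never computed the normalizing constant of the target distribution, the samples have approximately the right mean and standard deviation.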
To demonstrate, we’ll start by solving the World Cup Problem. Yes, again.
The World Cup Problem
In Chapter 8 we modeled goal scoring in football (soccer) as a Poisson process characterized by a goal-scoring rate, denoted λ.
We used a gamma distribution to represent the prior distribution of λ, then we used the outcome of the game to compute the posterior distribution for both teams.
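Because the gamma distribution is the conjugate prior for a Poisson likelihood, the posterior update can be sketched in a few lines. This is an assumption-laden illustration: the prior parameter `alpha = 1.4` and the score (France 4, Croatia 2) follow the World Cup example in Chapter 8, and the prior is taken to have rate 1.

```python
from scipy.stats import gamma

# Prior for lambda: gamma(alpha, rate=1); alpha = 1.4 is the value
# assumed from Chapter 8's World Cup example
alpha = 1.4

# Conjugate update for a Poisson likelihood: after observing k goals
# in one game, the posterior is gamma(alpha + k, rate + 1)
post_france = gamma(a=alpha + 4, scale=1/2)   # France scored 4
post_croatia = gamma(a=alpha + 2, scale=1/2)  # Croatia scored 2

print(post_france.mean())   # (1.4 + 4) / 2 = 2.7
print(post_croatia.mean())  # (1.4 + 2) / 2 = 1.7
```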
To answer the first question, we used the posterior distributions to compute the “probability of superiority” for France. ...
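One way to compute the probability of superiority, and the one that fits the theme of this chapter, is to draw samples from each posterior and count how often France's rate exceeds Croatia's. The sketch below assumes the gamma posteriors from the Chapter 8 example (`alpha = 1.4` prior, France 4 goals, Croatia 2); the sample size is arbitrary.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(42)
alpha = 1.4  # prior parameter assumed from Chapter 8

# Draw samples from the two posterior distributions
lam_france = gamma(a=alpha + 4, scale=1/2).rvs(100_000, random_state=rng)
lam_croatia = gamma(a=alpha + 2, scale=1/2).rvs(100_000, random_state=rng)

# Probability of superiority: fraction of samples where
# France's goal-scoring rate exceeds Croatia's
prob_superiority = np.mean(lam_france > lam_croatia)
print(prob_superiority)  # roughly 0.75
```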