Chapters 1 and 2 hid the inner mechanics of PyMC, and more generally Markov Chain Monte Carlo (MCMC), from the reader. The reasons for including this chapter are threefold. First, any book on Bayesian inference must discuss MCMC. I cannot fight this. Blame the statisticians. Second, knowing the process of MCMC gives you insight into whether your algorithm has converged. (Converged to what? We'll get to that.) Third, we'll understand why we are returned thousands of samples from the posterior as a solution, which at first can seem odd.
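To make that last point concrete, here is a minimal sketch, not taken from the book, of what "thousands of samples as a solution" looks like in practice. It uses a current PyMC release and a toy coin-flip model of my own choosing; the variable names and data are illustrative assumptions, not the book's example.

```python
import pymc as pm

# Hypothetical example: infer the unknown bias p of a coin from ten flips.
# MCMC does not return a single number for p; it returns thousands of
# samples drawn (approximately) from the posterior distribution of p.
with pm.Model() as model:
    p = pm.Uniform("p", 0, 1)  # prior on the unknown bias
    pm.Bernoulli("obs", p=p, observed=[1, 0, 1, 1, 0, 1, 1, 1, 0, 1])
    trace = pm.sample(2000)    # run MCMC; yields posterior samples

# The "answer" is a collection of samples, which we summarize as we like.
posterior_p = trace.posterior["p"].values.ravel()
print(posterior_p.shape, posterior_p.mean())
```

The histogram of `posterior_p` approximates the posterior itself, which is why a pile of samples, rather than a single estimate, is the natural output of MCMC.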
When we set up a Bayesian inference problem with N unknowns, we are implicitly creating an N-dimensional space for the prior distributions ...