Bayesian Statistics: An Introduction, 4th Edition

by Peter M. Lee
September 2012
486 pages
Wiley

9.3 Data augmentation by Monte Carlo

9.3.1 The genetic linkage example revisited

We can illustrate this technique with the example on genetic linkage we considered in connection with the EM algorithm. Recall that we found that the likelihood of the augmented data was proportional to

\[ \eta^{\,y_1 + x_4}\,(1 - \eta)^{\,x_2 + x_3}. \]

In this method, we suppose that at each stage we have a ‘current’ distribution for η, which initially is the prior distribution. At all stages, this has to be a proper distribution, so we may as well take our prior as the uniform distribution Be(1, 1), which in any case differs little from the reference prior Be(0, 0). At the tth stage in the imputation step, we pick m possible values η^(1), … , η^(m) of η by some (pseudo-) random mechanism with the current density, and then for each of these values η^(i) we generate a value for the augmented data y^(i), which in the particular example simply means picking a value y_1^(i) with a binomial distribution of index x_1 and parameter η^(i)/(η^(i) + 2). Since we had a Be(1, 1) prior, this gives a posterior
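Spelling out the conjugate update implied here (the Be(1, 1) prior combined with the augmented-data likelihood above), each imputed value y_1^(i) yields

\[ \eta \mid x,\, y_1^{(i)} \;\sim\; \operatorname{Be}\!\bigl(y_1^{(i)} + x_4 + 1,\; x_2 + x_3 + 1\bigr). \]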

In the posterior ...
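As a rough sketch of the imputation and posterior steps described above, the following Python code (not from the book; the variable names and the choices of m and of the number of stages are illustrative, and x = (125, 18, 20, 34) are the usual counts for this example) alternates between imputing y_1 given sampled values of η and refreshing the sample of η from the resulting Be(y_1 + x_4 + 1, x_2 + x_3 + 1) posteriors:

    import numpy as np

    rng = np.random.default_rng(0)

    # Observed counts for the genetic linkage example (the usual data).
    x1, x2, x3, x4 = 125, 18, 20, 34

    m = 1000          # imputations per stage (illustrative choice)
    n_stages = 20     # number of imputation/posterior cycles (illustrative)

    # 'Current' distribution for eta, initially the Be(1, 1) prior,
    # represented by a sample of m draws.
    eta_sample = rng.beta(1.0, 1.0, size=m)

    for t in range(n_stages):
        # Imputation step: draw eta^(i) from the current distribution and
        # impute y1^(i) ~ B(x1, eta^(i) / (eta^(i) + 2)).
        eta_draws = rng.choice(eta_sample, size=m, replace=True)
        y1 = rng.binomial(x1, eta_draws / (eta_draws + 2.0))

        # Posterior step: the new current distribution is the equally weighted
        # mixture of the Be(y1 + x4 + 1, x2 + x3 + 1) posteriors; approximate
        # it by drawing one eta from each component.
        eta_sample = rng.beta(y1 + x4 + 1, x2 + x3 + 1)

    print("approximate posterior mean of eta:", eta_sample.mean())

Here the ‘current’ distribution is represented by a sample of m draws, and the posterior step approximates the equally weighted mixture of the m component posteriors by drawing one value of η from each component.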


ISBN: 9781118359778