3 Bayesian Computation
This chapter introduces commonly used Bayesian computation methods. When conjugate priors are not available, or when the Bayesian model is complex, the posterior distribution may not be analytically tractable. Nowadays, with modern computers and Markov chain Monte Carlo (MCMC) algorithms, computing posteriors under a wide variety of prior distributions and fitting complex models have become feasible. In this chapter, MCMC algorithms, including the Metropolis algorithm and Gibbs sampling, are briefly introduced.
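As a concrete illustration of the Metropolis algorithm, the following is a minimal random-walk Metropolis sketch in R for a toy binomial model with a uniform Beta(1, 1) prior. The data, starting value, and proposal scale are hypothetical and chosen only for illustration; they are not one of this book's case studies.

## Minimal random-walk Metropolis sketch (hypothetical toy example)
set.seed(1)
y <- 6; n <- 20                               # hypothetical data: 6 failures in 20 trials
log_post <- function(theta) {                 # log posterior: binomial likelihood, Beta(1, 1) prior
  if (theta <= 0 || theta >= 1) return(-Inf)
  dbinom(y, size = n, prob = theta, log = TRUE)
}
n_iter <- 5000
theta <- numeric(n_iter)
theta[1] <- 0.5                               # starting value
for (i in 2:n_iter) {
  prop <- theta[i - 1] + rnorm(1, 0, 0.1)     # symmetric random-walk proposal
  log_ratio <- log_post(prop) - log_post(theta[i - 1])
  theta[i] <- if (log(runif(1)) < log_ratio) prop else theta[i - 1]
}
mean(theta[-(1:1000)])                        # posterior mean after discarding burn-in

Because the proposal is symmetric, the acceptance ratio reduces to the ratio of posterior densities, which is the defining feature of the Metropolis (as opposed to Metropolis-Hastings) algorithm.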
This chapter also introduces a widely used Bayesian computation program, Just Another Gibbs Sampler (JAGS), which provides MCMC sampling from Bayesian posteriors. We walk through example R and JAGS code to explain the role of its various portions, including creating and running the model, summarizing the posterior samples, and different methods for diagnosing MCMC chain convergence. Lastly, methods for model comparison are introduced.
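As a preview of that workflow, the following is a minimal sketch using the rjags interface and the coda package. The model, data, and sampler settings are hypothetical placeholders rather than the chapter's actual example.

library(rjags)                     # JAGS interface for R; also loads coda

model_string <- "
model {
  y ~ dbin(theta, n)               # binomial likelihood
  theta ~ dbeta(1, 1)              # uniform prior
}"
dat <- list(y = 6, n = 20)         # hypothetical data

jm <- jags.model(textConnection(model_string), data = dat,
                 n.chains = 3, n.adapt = 1000)    # create the model
update(jm, 1000)                                  # burn-in
post <- coda.samples(jm, variable.names = "theta",
                     n.iter = 5000)               # draw posterior samples

summary(post)                      # posterior summaries
gelman.diag(post)                  # convergence: potential scale reduction factor
effectiveSize(post)                # effective sample size

Running multiple chains, as done here with n.chains = 3, is what makes the Gelman-Rubin diagnostic meaningful; trace plots of the coda object provide a complementary visual check.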
3.1 Introduction
In Chapter 2 we introduced conjugate prior distributions, for which the corresponding posterior distributions have closed forms. For some distributions, however, conjugate priors are not available. For example, suppose we would like to estimate both the shape and the scale parameter of a two-parameter Weibull distribution, a popular distribution in reliability applications. A closed-form solution is not available in this case.
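To see why, write the joint posterior under the usual shape-scale parameterization; the symbols beta for shape, eta for scale, and the generic prior p(beta, eta) are our notation for illustration. For observed lifetimes t_1, ..., t_n,

p(\beta, \eta \mid t_1, \ldots, t_n) \;\propto\; p(\beta, \eta) \prod_{i=1}^{n} \frac{\beta}{\eta}\left(\frac{t_i}{\eta}\right)^{\beta-1} \exp\!\left[-\left(\frac{t_i}{\eta}\right)^{\beta}\right],

and no standard joint prior on (beta, eta) turns the right-hand side into a recognizable closed-form distribution, so the posterior must be explored numerically, for example by MCMC.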
The Weibull distribution is one of the most commonly used lifetime ...