Bayesian Statistics: An Introduction, 4th Edition

Book description

Bayesian statistics is the school of thought that combines prior beliefs with the likelihood of the observed data under a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee's book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo-based techniques.
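The prior-to-posterior update the book builds on can be sketched in a few lines. Below is a minimal, hypothetical illustration (not taken from the book) using the conjugate beta-binomial model: a Beta(a, b) prior on a coin's heads probability, updated by observed heads and tails, yields a Beta posterior.

```python
def posterior_beta(a, b, successes, failures):
    """Update a Beta(a, b) prior with binomial data.

    For the beta-binomial conjugate pair, posterior ∝ prior × likelihood
    reduces to simply adding the observed counts to the prior parameters.
    """
    return a + successes, b + failures

# Start from a uniform Beta(1, 1) prior, then observe 7 heads and 3 tails.
a_post, b_post = posterior_beta(1, 1, 7, 3)
print(a_post, b_post)              # Beta(8, 4) posterior
print(a_post / (a_post + b_post))  # posterior mean = 8/12 ≈ 0.667
```

The posterior mean (2/3) sits between the prior mean (1/2) and the sample proportion (7/10), showing how the data pull the prior belief toward what was observed.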

This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and reversible jump Markov chain Monte Carlo (RJMCMC), providing a concise account of how the Bayesian approach to statistics develops and how it contrasts with the conventional frequentist approach. The theory is built up step by step, and important notions such as sufficiency are brought out of a discussion of the salient features of specific examples.

This edition:

  • Includes expanded coverage of Gibbs sampling, including more numerical examples and treatments of OpenBUGS, R2WinBUGS and R2OpenBUGS.

  • Presents significant new material on recent techniques such as Bayesian importance sampling, variational Bayes, Approximate Bayesian Computation (ABC) and Reversible Jump Markov Chain Monte Carlo (RJMCMC).

  • Provides extensive examples throughout the book to complement the theory presented.

  • Accompanied by a supporting website featuring new material and solutions.
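As a taste of the Gibbs sampling material highlighted above, here is a minimal, hypothetical sketch (not the book's own code) that samples from a standard bivariate normal with correlation rho by alternately drawing each coordinate from its exact full conditional distribution:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is itself normal: x | y ~ N(rho*y, 1 - rho^2),
    and symmetrically for y | x, so we can alternate exact draws.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for i in range(burn_in + n_samples):
        x = rng.gauss(rho * y, sd)   # draw x given current y
        y = rng.gauss(rho * x, sd)   # draw y given updated x
        if i >= burn_in:             # discard the burn-in iterations
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=5000)
mean_x = sum(x for x, _ in samples) / len(samples)
print(round(mean_x, 2))  # close to 0, the true marginal mean
```

Discarding a burn-in and then averaging the retained draws approximates posterior expectations; the same alternating-conditional idea extends to the hierarchical models treated in Chapters 8 and 9.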

More and more students are realizing that they need to learn Bayesian statistics to meet their academic and professional goals. This book is best suited for use as a main text in courses on Bayesian statistics for third and fourth year undergraduates and postgraduate students.

Table of contents

  1. Cover
  2. Title Page
  3. Copyright
  4. Dedication
  5. Preface
  6. Preface to the First Edition
  7. Chapter 1: Preliminaries
    1. 1.1 Probability and Bayes’ Theorem
    2. 1.2 Examples on Bayes’ Theorem
    3. 1.3 Random variables
    4. 1.4 Several random variables
    5. 1.5 Means and variances
    6. 1.6 Exercises on Chapter 1
  8. Chapter 2: Bayesian inference for the normal distribution
    1. 2.1 Nature of Bayesian inference
    2. 2.2 Normal prior and likelihood
    3. 2.3 Several normal observations with a normal prior
    4. 2.4 Dominant likelihoods
    5. 2.5 Locally uniform priors
    6. 2.6 Highest density regions
    7. 2.7 Normal variance
    8. 2.8 HDRs for the normal variance
    9. 2.9 The role of sufficiency
    10. 2.10 Conjugate prior distributions
    11. 2.11 The exponential family
    12. 2.12 Normal mean and variance both unknown
    13. 2.13 Conjugate joint prior for the normal distribution
    14. 2.14 Exercises on Chapter 2
  9. Chapter 3: Some other common distributions
    1. 3.1 The binomial distribution
    2. 3.2 Reference prior for the binomial likelihood
    3. 3.3 Jeffreys’ rule
    4. 3.4 The Poisson distribution
    5. 3.5 The uniform distribution
    6. 3.6 Reference prior for the uniform distribution
    7. 3.7 The tramcar problem
    8. 3.8 The first digit problem; invariant priors
    9. 3.9 The circular normal distribution
    10. 3.10 Approximations based on the likelihood
    11. 3.11 Reference posterior distributions
    12. 3.12 Exercises on Chapter 3
  10. Chapter 4: Hypothesis testing
    1. 4.1 Hypothesis testing
    2. 4.2 One-sided hypothesis tests
    3. 4.3 Lindley’s method
    4. 4.4 Point (or sharp) null hypotheses with prior information
    5. 4.5 Point null hypotheses for the normal distribution
    6. 4.6 The Doogian philosophy
    7. 4.7 Exercises on Chapter 4
  11. Chapter 5: Two-sample problems
    1. 5.1 Two-sample problems – both variances unknown
    2. 5.2 Variances unknown but equal
    3. 5.3 Variances unknown and unequal (Behrens–Fisher problem)
    4. 5.4 The Behrens–Fisher controversy
    5. 5.5 Inferences concerning a variance ratio
    6. 5.6 Comparison of two proportions; the 2 × 2 table
    7. 5.7 Exercises on Chapter 5
  12. Chapter 6: Correlation, regression and the analysis of variance
    1. 6.1 Theory of the correlation coefficient
    2. 6.2 Examples on the use of the correlation coefficient
    3. 6.3 Regression and the bivariate normal model
    4. 6.4 Conjugate prior for the bivariate regression model
    5. 6.5 Comparison of several means – the one way model
    6. 6.6 The two way layout
    7. 6.7 The general linear model
    8. 6.8 Exercises on Chapter 6
  13. Chapter 7: Other topics
    1. 7.1 The likelihood principle
    2. 7.2 The stopping rule principle
    3. 7.3 Informative stopping rules
    4. 7.4 The likelihood principle and reference priors
    5. 7.5 Bayesian decision theory
    6. 7.6 Bayes linear methods
    7. 7.7 Decision theory and hypothesis testing
    8. 7.8 Empirical Bayes methods
    9. 7.9 Exercises on Chapter 7
  14. Chapter 8: Hierarchical models
    1. 8.1 The idea of a hierarchical model
    2. 8.2 The hierarchical normal model
    3. 8.3 The baseball example
    4. 8.4 The Stein estimator
    5. 8.5 Bayesian analysis for an unknown overall mean
    6. 8.6 The general linear model revisited
    7. 8.7 Exercises on Chapter 8
  15. Chapter 9: The Gibbs sampler and other numerical methods
    1. 9.1 Introduction to numerical methods
    2. 9.2 The EM algorithm
    3. 9.3 Data augmentation by Monte Carlo
    4. 9.4 The Gibbs sampler
    5. 9.5 Rejection sampling
    6. 9.6 The Metropolis–Hastings algorithm
    7. 9.7 Introduction to WinBUGS and OpenBUGS
    8. 9.8 Generalized linear models
    9. 9.9 Exercises on Chapter 9
  16. Chapter 10: Some approximate methods
    1. 10.1 Bayesian importance sampling
    2. 10.2 Variational Bayesian methods: simple case
    3. 10.3 Variational Bayesian methods: general case
    4. 10.4 ABC: Approximate Bayesian Computation
    5. 10.5 Reversible jump Markov chain Monte Carlo
    6. 10.6 Exercises on Chapter 10
  17. Appendix A: Common statistical distributions
    1. A.1 Normal distribution
    2. A.2 Chi-squared distribution
    3. A.3 Normal approximation to chi-squared
    4. A.4 Gamma distribution
    5. A.5 Inverse chi-squared distribution
    6. A.6 Inverse chi distribution
    7. A.7 Log chi-squared distribution
    8. A.8 Student’s t distribution
    9. A.9 Normal/chi-squared distribution
    10. A.10 Beta distribution
    11. A.11 Binomial distribution
    12. A.12 Poisson distribution
    13. A.13 Negative binomial distribution
    14. A.14 Hypergeometric distribution
    15. A.15 Uniform distribution
    16. A.16 Pareto distribution
    17. A.17 Circular normal distribution
    18. A.18 Behrens’ distribution
    19. A.19 Snedecor’s F distribution
    20. A.20 Fisher’s z distribution
    21. A.21 Cauchy distribution
    22. A.22 The probability that one beta variable is greater than another
    23. A.23 Bivariate normal distribution
    24. A.24 Multivariate normal distribution
    25. A.25 Distribution of the correlation coefficient
  18. Appendix B: Tables
  19. Appendix C: R programs
  20. Appendix D: Further reading
    1. D.1 Robustness
    2. D.2 Nonparametric methods
    3. D.3 Multivariate estimation
    4. D.4 Time series and forecasting
    5. D.5 Sequential methods
    6. D.6 Numerical methods
    7. D.7 Bayesian networks
    8. D.8 General reading
  21. References
  22. Index

Product information

  • Title: Bayesian Statistics: An Introduction, 4th Edition
  • Author(s): Peter Lee
  • Release date: September 2012
  • Publisher(s): Wiley
  • ISBN: 9781118332573