Chapter 3 Regression techniques
3.1 Introduction: Bayesian regression
Methods for Bayesian estimation of the normal linear regression model, whether with univariate or multivariate outcome, are well established. With an inverse gamma prior on the residual variance in univariate regression, and a conjugate normal prior on the regression coefficients (conditional on the residual variance), analytic formulae for the posterior densities of these coefficients and other relevant quantities (e.g. predicted responses for new predictor values) are available. These permit direct estimation with no need for repeated sampling. However, the normal linear regression model is restricted to continuous responses and makes assumptions regarding the error structure, the form of relationship, and the appropriate form of predictors that are not necessarily met in practice. Parameter estimation under alternative assumptions or responses, such as heteroscedastic linear regression (Peña et al., 2009), generalised linear models (e.g. Gerwinn et al., 2010), non-linear or varying coefficient relationships (e.g. Blum and François, 2010), and non-conjugate priors (e.g. Fang and Dawid, 2002), is typically facilitated by a sampling based approach to estimation. Similar advantages from iterative sampling apply in assessing the density of model parameters, or of structural quantities defined by functions of parameters and data. The Bayesian approach may also be used to advantage in regression model selection, in ...
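As a minimal illustration of the conjugate case mentioned above, the following Python sketch computes the closed-form posterior for univariate normal linear regression under the standard normal-inverse-gamma prior, beta | sigma^2 ~ N(m0, sigma^2 V0) and sigma^2 ~ IG(a0, b0). The simulated data and the hyperparameter values (m0, V0, a0, b0) are illustrative assumptions, not taken from the text; the point is simply that the posterior parameters are available directly, with no repeated sampling.

    import numpy as np

    def nig_posterior(X, y, m0, V0, a0, b0):
        # Conjugate update for y = X beta + e, e ~ N(0, sigma^2 I),
        # with beta | sigma^2 ~ N(m0, sigma^2 V0) and sigma^2 ~ IG(a0, b0).
        n = len(y)
        V0_inv = np.linalg.inv(V0)
        prec_n = V0_inv + X.T @ X              # posterior precision scale for beta
        Vn = np.linalg.inv(prec_n)
        mn = Vn @ (V0_inv @ m0 + X.T @ y)      # posterior mean of beta
        an = a0 + n / 2.0
        bn = b0 + 0.5 * (y @ y + m0 @ V0_inv @ m0 - mn @ prec_n @ mn)
        return mn, Vn, an, bn

    # Hypothetical example with simulated data
    rng = np.random.default_rng(0)
    n, p = 50, 2
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

    mn, Vn, an, bn = nig_posterior(X, y, m0=np.zeros(p), V0=100 * np.eye(p), a0=2.0, b0=1.0)
    print("posterior mean of beta:", mn)
    print("posterior mean of sigma^2:", bn / (an - 1))

Once non-normal responses, non-linear relationships or non-conjugate priors are introduced, such closed-form updates are generally unavailable, which is what motivates the sampling based estimation discussed in the remainder of the chapter.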