9 Regression Models

This chapter introduces how to use Bayesian methods to build linear regression and binary logistic regression models. Examples and R scripts are provided. We also show how to use a Bayesian regression model to make predictions.

9.1 Linear Regression

Linear regression models the relationship between a continuous response variable (also called a dependent variable) and one or more predictors (also called independent variables) by fitting a linear equation to observed data.

A simple linear regression model describes the relationship between a continuous response variable and a single predictor as a straight line, i.e.

y = β₀ + β₁x + ε    (9.1)

where

  • y is the response (or dependent variable, or predicted variable)
  • x is the predictor (or independent variable, or regressor variable)
  • β₀ and β₁ are unknown regression coefficients
  • β₀ is the intercept, which is the mean of the distribution of y when x equals zero
  • β₁ is the slope coefficient, which indicates the change in the mean of the distribution of y when x increases by one unit
  • ε is the error term, which is assumed to follow a normal distribution with mean 0 and unknown variance σ².

Thus, at each given value of x, y follows a normal distribution with mean β₀ + β₁x and variance σ². This simple structure makes the model easy to interpret and apply in a wide range of fields.
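To make this concrete, the following is a minimal R sketch (not the book's own script) that fits a Bayesian simple linear regression to simulated data using the rstanarm package; the simulated data, variable names, prior choices (rstanarm defaults), and sampler settings are illustrative assumptions only.

    # Minimal sketch: Bayesian simple linear regression with rstanarm
    # (assumes rstanarm is installed; defaults give weakly informative priors)
    library(rstanarm)

    set.seed(123)

    # Simulate illustrative data from y = 2 + 0.5*x + eps, eps ~ N(0, 1)
    n <- 50
    x <- runif(n, 0, 10)
    y <- 2 + 0.5 * x + rnorm(n, mean = 0, sd = 1)
    dat <- data.frame(x = x, y = y)

    # Fit y = beta0 + beta1*x + eps; family = gaussian() gives the normal error term
    fit <- stan_glm(y ~ x, data = dat, family = gaussian(),
                    chains = 4, iter = 2000, seed = 123)

    # Posterior summaries of beta0 (intercept), beta1 (slope), and sigma
    print(fit, digits = 2)
    posterior_interval(fit, prob = 0.95)

    # Posterior predictive distribution of y at a new value x = 5
    newdat <- data.frame(x = 5)
    pred <- posterior_predict(fit, newdata = newdat)
    quantile(pred, probs = c(0.025, 0.5, 0.975))

The posterior medians of the intercept and slope serve as point estimates of β₀ and β₁, and the posterior predictive draws at a new x illustrate the kind of prediction discussed later in this chapter.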

For example, a linear regression equation can be used to establish the ...
