Chapter 6
Laying Linear Regression Foundations
IN THIS CHAPTER
Performing various tasks with variables
Dealing with probabilities
Considering which features to use
Learning by using Stochastic Gradient Descent (SGD)
The term linear regression may seem complicated, but it’s not, as you see in this chapter. A linear regression is essentially a straight line drawn through a series of x/y coordinates, each of which determines the location of a data point. The data points may not always lie directly on the line, but the line shows where the data points would fall in a perfect world of linear coordinates. By using the line, you can predict a value of y (the criterion variable) given a value of x (the predictor variable). When you have just one predictor variable, you have a simple linear regression. In contrast, when you have many predictors, you have a multiple linear regression, which relies not on a line but on a plane extending through multiple dimensions. Deep learning uses data inputs to guess the nonlinear surface that will most correctly go through the middle of a set of data points ...
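To make the idea concrete, here is a minimal sketch of a simple linear regression fit with scikit-learn's LinearRegression class. The synthetic data, random seed, and variable names are illustrative assumptions rather than material from the chapter; the point is simply that the fitted line recovers a slope and intercept you can use to predict y from x.

# A minimal sketch of simple linear regression on made-up data.
# The true relationship used to generate y is an assumption for
# illustration only (slope 3.0, intercept 2.0, plus noise).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=(50, 1))                # predictor variable
y = 3.0 * x.ravel() + 2.0 + rng.normal(0, 1, 50)    # criterion variable near a line

model = LinearRegression()
model.fit(x, y)                                     # find the best-fitting line

print("slope:", model.coef_[0])                     # close to 3.0
print("intercept:", model.intercept_)               # close to 2.0
print("prediction at x = 4:", model.predict([[4.0]])[0])

With many predictors, the same fit call accepts an x array with several columns, and the fitted object describes a plane (or hyperplane) rather than a line.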