Jumping from the logistic function to logistic regression

Now that we have some knowledge of the logistic function, it is easy to map it to the algorithm that stems from it. In logistic regression, the function input z becomes the weighted sum of the features. Given a data sample with n features, x1, x2, …, xn (x represents the feature vector, x = (x1, x2, …, xn)), and the weights (also called coefficients) of the model, w (w represents the vector (w1, w2, …, wn)), z is expressed as follows:

z = w1x1 + w2x2 + … + wnxn = wᵀx
Also, occasionally, the model comes with an intercept (also called bias), w0. In this instance, the preceding linear relationship becomes:

z = w0 + w1x1 + w2x2 + … + wnxn
As for the output, the model feeds z into the logistic function, which maps it to a value in (0, 1) that is interpreted as the probability of the sample belonging to the positive class:

ŷ = 1 / (1 + e⁻ᶻ)
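The mapping from features and weights to a predicted probability can be sketched in plain Python. The weights, intercept, and sample values below are illustrative placeholders, not fitted parameters:

```python
import math

def logistic(z):
    # Logistic (sigmoid) function: maps any real z to the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(x, w, w0=0.0):
    # Weighted sum of features plus intercept: z = w0 + w1*x1 + ... + wn*xn
    z = w0 + sum(wi * xi for wi, xi in zip(w, x))
    # The model output is the logistic function applied to z,
    # read as the probability of the positive class
    return logistic(z)

# Illustrative example: two features with made-up weights
x = [1.0, 2.0]
w = [0.5, -0.25]
print(predict_proba(x, w, w0=0.1))
```

Here z = 0.1 + 0.5*1.0 + (-0.25)*2.0 = 0.1, so the printed probability is slightly above 0.5, reflecting a weak pull toward the positive class.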
