Jumping from the logistic function to logistic regression

Now that we have some knowledge of the logistic function, it is easy to map it to the algorithm that stems from it. In logistic regression, the function input z becomes the weighted sum of the features. Given a data sample x with n features x1, x2, …, xn (so x represents the feature vector (x1, x2, …, xn)), and the weights (also called coefficients) of the model w (where w represents the vector (w1, w2, …, wn)), z is expressed as follows:

z = w1x1 + w2x2 + … + wnxn
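
To make the weighted sum concrete, here is a minimal NumPy sketch; the feature values and weights are made-up numbers used purely for illustration:

```python
import numpy as np

# A sample with n = 3 features and example weights (illustrative values only)
x = np.array([0.5, 1.2, -0.3])   # feature vector x = (x1, x2, x3)
w = np.array([0.8, -0.4, 1.5])   # weight vector w = (w1, w2, w3)

# z is the weighted sum of the features: z = w1*x1 + w2*x2 + ... + wn*xn
z = np.dot(w, x)
print(z)   # 0.8*0.5 + (-0.4)*1.2 + 1.5*(-0.3) = -0.53
```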
The model also sometimes comes with an intercept (also called the bias), w0. In that case, the preceding linear relationship becomes:

z = w0 + w1x1 + w2x2 + … + wnxn
As for the output, the model passes z through the logistic function, ŷ = 1 / (1 + e^(−z)), which yields a value between 0 and 1 that is interpreted as the probability of the sample belonging to the positive class.
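
Putting the pieces together, the forward computation can be sketched as follows; the function name predict_proba and all numeric values are assumptions for illustration, not taken from the text:

```python
import numpy as np

def predict_proba(x, w, w0=0.0):
    """Logistic regression prediction for one sample.

    Computes z = w0 + w . x, then squashes it through the
    logistic (sigmoid) function to get a probability in (0, 1).
    """
    z = w0 + np.dot(w, x)
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative feature vector, weights, and intercept
x = np.array([0.5, 1.2, -0.3])
w = np.array([0.8, -0.4, 1.5])
w0 = 0.1

y_hat = predict_proba(x, w, w0)
print(y_hat)   # estimated probability that the sample is in the positive class
```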
