Now that we are familiar with the logistic function, it is easy to map it to the algorithm that stems from it. In logistic regression, the function input z becomes the weighted sum of features. Given a data sample x with n features x1, x2, ..., xn (x represents a feature vector, x = (x1, x2, ..., xn)), and the weights (also called coefficients) of the model w (w represents the vector (w1, w2, ..., wn)), z is expressed as follows:
$$z = w_1 x_1 + w_2 x_2 + \cdots + w_n x_n = \mathbf{w}^{T}\mathbf{x}$$
Sometimes the model also comes with an intercept (also called bias), w0; in that case, the preceding linear relationship becomes:

$$z = w_0 + w_1 x_1 + w_2 x_2 + \cdots + w_n x_n$$
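To make the weighted sum concrete, here is a minimal NumPy sketch (the helper names compute_z and sigmoid are illustrative, not from the text) that computes z for a single sample with an intercept and then maps it through the logistic function:

```python
import numpy as np

def compute_z(x, w, w0=0.0):
    """Weighted sum of features plus intercept: z = w0 + w1*x1 + ... + wn*xn."""
    return w0 + np.dot(w, x)

def sigmoid(z):
    """Logistic function mapping z into the (0, 1) range."""
    return 1.0 / (1.0 + np.exp(-z))

# Example with three features
x = np.array([0.5, 1.2, -0.3])   # feature vector x = (x1, x2, x3)
w = np.array([2.0, -1.0, 0.4])   # weights w = (w1, w2, w3)
w0 = 0.1                         # intercept (bias)

z = compute_z(x, w, w0)
print(z)             # the weighted sum z
print(sigmoid(z))    # y(z), a value between 0 and 1
```

Note that np.dot(w, x) computes exactly the sum w1*x1 + w2*x2 + ... + wn*xn, so adding w0 gives the intercept form of z shown above.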
As for the output y(z) in the range of 0 to 1, in ...