November 2019 · 296 pages
As we saw in the previous chapter, we need to define an optimizer in order to train a logistic regression model. Here, we need to consider the following:
Choosing the optimization algorithm is straightforward: any of the optimizers we used for the regression problem will also work here. For example, we can use the Adam optimizer, just as we did for polynomial regression. What we do need to reconsider is the loss function. The mean squared error would work, but there is a loss function that is better suited to binary classification.
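To make the comparison concrete, here is a minimal NumPy sketch of that better-suited loss, binary cross-entropy. The function name, the clipping constant `eps`, and the example predictions are my own illustrative choices, not taken from the book:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean binary cross-entropy loss.

    y_true: array of 0/1 class labels.
    y_pred: predicted probabilities in (0, 1).
    eps clips the predictions so that log(0) is never evaluated.
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred)
                    + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1.0, 0.0, 1.0, 0.0])
confident = np.array([0.9, 0.1, 0.8, 0.2])  # confident, correct predictions
uncertain = np.array([0.6, 0.4, 0.5, 0.5])  # hesitant predictions

# Cross-entropy rewards confident, correct probabilities with a lower loss.
print(binary_cross_entropy(y_true, confident))
print(binary_cross_entropy(y_true, uncertain))
```

Unlike the mean squared error, this loss penalizes a confidently wrong probability very heavily, which gives logistic regression much stronger gradients early in training.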
Let's assume that the dataset is $\{(x_i, y_i)\}_{i=1}^{N}$, where $y_i \in \{0, 1\}$ specifies the class label that the data belongs to. The likelihood ...
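For reference, the standard form of that likelihood for logistic regression (written in my own notation, with $\theta$ for the model parameters and $\sigma$ for the sigmoid, which may differ from the book's symbols) is:

```latex
L(\theta) = \prod_{i=1}^{N} p_i^{\,y_i}\,(1 - p_i)^{1 - y_i},
\qquad
p_i = \sigma(\theta^{\top} x_i) = \frac{1}{1 + e^{-\theta^{\top} x_i}}
```

Taking the negative logarithm turns the product into a sum,

```latex
-\log L(\theta) = -\sum_{i=1}^{N} \bigl[\, y_i \log p_i + (1 - y_i)\log(1 - p_i) \,\bigr]
```

which, averaged over the dataset, is exactly the binary cross-entropy loss mentioned above.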