April 2018 · Intermediate to advanced · 334 pages · 10h 18m · English
The gradient descent algorithm is an optimization algorithm that finds the minimum of a function using first-order derivatives; that is, we differentiate the function with respect to its parameters to first order only. Here, the objective of gradient descent is to minimize the cost function with respect to its parameters.
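As a concrete illustration, here is a minimal sketch of gradient descent on a simple one-parameter quadratic cost. The cost function, learning rate, and iteration count are illustrative choices, not taken from the text:

```python
# Minimal gradient descent sketch on f(theta) = (theta - 3)^2,
# whose minimum lies at theta = 3. All values here are
# hypothetical and chosen only for illustration.

def cost(theta):
    return (theta - 3.0) ** 2

def gradient(theta):
    # First-order derivative of the cost with respect to theta
    return 2.0 * (theta - 3.0)

def gradient_descent(theta=0.0, learning_rate=0.1, iterations=100):
    # Repeatedly step against the gradient to reduce the cost
    for _ in range(iterations):
        theta -= learning_rate * gradient(theta)
    return theta

theta_min = gradient_descent()
print(round(theta_min, 4))  # converges close to 3.0
```

Each iteration moves the parameter a small step in the direction opposite to the derivative, which is exactly the update scheme the following steps describe.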
This approach repeats the following steps over numerous iterations to minimize the cost: