June 2020
Intermediate to advanced
364 pages
13h 56m
English
Gradient descent is a widely used first-order optimization algorithm. Starting from its current point, it repeatedly takes steps in the direction of the negative gradient of the function until it eventually terminates at a (locally) optimal solution.
Imagine you're at a skateboarding park with a tennis ball in your hand. You bend down, place the ball on the surface of a ramp, and let it go; gravity does its thing, and the ball follows the ramp's curvature, finding its way to the bottom. This is the intuition behind gradient descent.
In this case, the natural choice for the step is the negative gradient; that is, $x_{k+1} = x_k - \alpha \nabla f(x_k)$, where $\alpha > 0$ is the step size. This is known as the method of steepest descent.
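The update rule described above can be sketched in a few lines of code. The following is a minimal illustration, not the book's implementation; the function names, the example objective $f(x) = x^2$, and the step size are assumptions chosen for clarity.

```python
def gradient_descent(grad, x0, step_size=0.1, n_steps=100):
    """Minimize a function by repeatedly stepping along its negative gradient.

    grad      -- function returning the gradient of f at a point
    x0        -- starting point
    step_size -- how far to move along the negative gradient each iteration
    n_steps   -- fixed number of iterations (a simple termination rule)
    """
    x = x0
    for _ in range(n_steps):
        x = x - step_size * grad(x)  # x_{k+1} = x_k - alpha * grad f(x_k)
    return x

# Example: f(x) = x**2 has gradient 2*x and its minimum at x = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=5.0)
```

In practice, the fixed iteration count would be replaced by a convergence test, such as stopping once the gradient's magnitude falls below a tolerance.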