Before we look at Adam optimization, let's try to first understand the concept of gradient descent.
Gradient descent is an iterative optimization algorithm for finding the minimum of a function. An analogy: suppose we are stuck somewhere in the middle of a mountain and want to reach the ground as quickly as possible. As a first step, we observe the slope of the mountain in all directions around us and decide to move in the direction with the steepest downward slope.
We re-evaluate our choice of direction after every step we take. The size of each step also depends on the steepness of the downward slope: if the slope is very steep, we take bigger steps, since that helps us descend faster.
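To make this concrete, here is a minimal sketch of gradient descent on a simple one-dimensional function, f(x) = (x - 3)^2, whose minimum we know is at x = 3. The function, the learning rate, and the number of steps are illustrative assumptions, not values from the book; the update rule itself is the standard one, x = x - learning_rate * f'(x).

```python
# A minimal sketch of gradient descent on f(x) = (x - 3)^2.
# The true minimum is at x = 3; the gradient is f'(x) = 2 * (x - 3).
# `learning_rate` and `num_steps` below are illustrative choices.

def gradient(x):
    """Gradient of f(x) = (x - 3)^2."""
    return 2.0 * (x - 3.0)

x = 0.0              # starting point (our position "on the mountain")
learning_rate = 0.1  # step size: how far we move along the slope
num_steps = 50

for step in range(num_steps):
    # Move in the direction opposite the gradient (downhill).
    x = x - learning_rate * gradient(x)

print(f"Estimated minimum at x = {x:.4f}")  # approaches 3.0
```

Note that the effective step length shrinks automatically as we approach the minimum, because the gradient itself gets smaller there; this mirrors the mountain analogy, where we naturally take smaller steps as the slope flattens out.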