November 2019
Intermediate to advanced
296 pages
7h 52m
English
There are many improved variants of the gradient descent algorithm. The following is the list of optimizers provided by TensorFlow.js:
All of them derive from the original gradient descent algorithm; they differ in how the update value is calculated at each training iteration. For example, Momentum is an algorithm that helps us avoid the oscillation that tends to occur near a local minimum, and it is expected to make convergence faster. The update value of the momentum algorithm is calculated as follows:
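The equations themselves are not reproduced in this excerpt. In its standard textbook formulation (an assumption here, not taken from the excerpt), the momentum update is commonly written with a velocity term \(v_t\), momentum coefficient \(\gamma\), and learning rate \(\eta\):

```latex
v_t = \gamma\, v_{t-1} + \eta\, \nabla_{\theta} J(\theta)
\theta = \theta - v_t
```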
The most notable change is in the first equation. In addition to the gradient of the current ...
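As a sketch of the rule just described, here is the momentum update written in plain JavaScript. The parameter values (learning rate 0.1, momentum 0.9) and the quadratic objective are illustrative choices of mine, not taken from the book; in TensorFlow.js itself you would use the built-in `tf.train.momentum` optimizer rather than hand-rolling this.

```javascript
// Minimal sketch of the momentum update rule in plain JavaScript.
// The hyperparameters and the objective f(x) = x^2 are illustrative
// assumptions, not values from the text.

function momentumStep(theta, velocity, grad, learningRate, momentum) {
  // First equation: the new velocity accumulates a fraction of the
  // previous velocity plus the scaled current gradient.
  const v = momentum * velocity + learningRate * grad;
  // Second equation: the parameter moves against the velocity.
  return { theta: theta - v, velocity: v };
}

// Minimize f(x) = x^2 (gradient 2x) starting from x = 5.
let state = { theta: 5, velocity: 0 };
for (let i = 0; i < 200; i++) {
  const grad = 2 * state.theta;
  state = momentumStep(state.theta, state.velocity, grad, 0.1, 0.9);
}
console.log(state.theta); // approaches the minimum at x = 0
```

Because the velocity carries information from earlier steps, successive gradients that point in the same direction reinforce each other, while alternating gradients partially cancel, which is what damps the oscillation near the minimum.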