Understanding backpropagation

A neural network is considered trained when we have found weights that make it predict well on our data. So, the question is: how do we reach these weights? Neural networks are usually trained with a gradient descent algorithm. This might be pure gradient descent or an improved optimization method such as the Adam optimizer, which is likewise based on computing gradients.
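As a rough sketch of the idea, here is gradient descent on a single weight, minimizing a toy loss whose gradient we know in closed form. The names (`loss`, `grad`, `learning_rate`) and the loss itself are illustrative choices, not from the book:

```python
# A minimal sketch of gradient descent on one weight w, minimizing
# the toy loss L(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# All names here are illustrative, not from the book.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0               # initial weight
learning_rate = 0.1   # step size

for _ in range(100):
    # Step opposite to the gradient to reduce the loss.
    w -= learning_rate * grad(w)

# w converges toward the minimum of the loss at w = 3
```

Optimizers such as Adam refine this basic update (for example, with per-weight adaptive step sizes), but the gradient computation at its heart stays the same.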

In all of these algorithms, we need to compute the gradient of the loss function with respect to all of the weights. Since a neural network is a complex composite function, this might not seem straightforward. This is where the backpropagation algorithm comes in, which allows us to calculate ...
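To make the chain-rule idea behind backpropagation concrete, here is a sketch for a tiny one-hidden-layer network. The sigmoid activation, squared-error loss, and all variable names are illustrative assumptions, not the book's code; the backward pass is verified against a numerical gradient:

```python
import numpy as np

# A sketch of backpropagation through a tiny one-hidden-layer network.
# Sigmoid activation and squared-error loss are illustrative choices.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # input vector
y = 1.0                       # target value
W1 = rng.normal(size=(4, 3))  # hidden-layer weights
W2 = rng.normal(size=4)       # output-layer weights

# Forward pass, keeping intermediates for the backward pass.
h = sigmoid(W1 @ x)                 # hidden activations
y_hat = W2 @ h                      # network output
loss = 0.5 * (y_hat - y) ** 2

# Backward pass: apply the chain rule layer by layer.
d_yhat = y_hat - y                  # dL/dy_hat
dW2 = d_yhat * h                    # dL/dW2
d_h = d_yhat * W2                   # dL/dh
d_z = d_h * h * (1.0 - h)           # dL/dz through the sigmoid
dW1 = np.outer(d_z, x)              # dL/dW1

# Sanity check: compare one entry with a finite-difference estimate.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
loss_p = 0.5 * (W2 @ sigmoid(W1p @ x) - y) ** 2
numeric = (loss_p - loss) / eps
```

The key point is that each layer only needs the gradient flowing in from the layer above and its own cached forward-pass values, which is what makes the computation efficient for deep networks.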
