September 2017
We covered forward propagation in detail in Chapter 1, Neural Network and Artificial Intelligence Concepts, and touched briefly on backpropagation using gradient descent. Backpropagation is one of the key concepts for understanding neural networks: it relies on calculus to update the weights and biases in each layer. Backpropagation of errors is similar to learning from mistakes. We correct our mistakes (errors) in every iteration until we reach a point called convergence. The goal of backpropagation is to adjust the weights in each layer so as to minimize the overall error at the output layer.
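The idea can be made concrete with a minimal sketch in Python/NumPy (the network shape, learning rate, and toy XOR dataset below are illustrative assumptions, not taken from the book): a forward pass computes the output, the output error is propagated backward layer by layer, and gradient descent updates each layer's weights and biases until the loss converges.

```python
import numpy as np

# Illustrative sketch: a tiny 2-4-1 network with sigmoid activations
# and squared error, trained on a toy XOR dataset. All sizes and
# hyperparameters here are assumptions for demonstration.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: XOR inputs and targets.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Weights and biases for the hidden and output layers.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

lr = 0.5
losses = []
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: propagate the error from the output layer
    # back through the hidden layer (chain rule).
    d_out = (out - y) * out * (1 - out)   # error signal at output
    d_h = (d_out @ W2.T) * h * (1 - h)    # error signal at hidden layer

    # Gradient-descent updates for each layer's weights and biases.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Each iteration "learns from its mistakes": the loss recorded at the start of training is larger than the loss after the updates have run, which is the convergence behavior the text describes.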
Learning in feed-forward neural networks relies heavily on backpropagation. The usual steps ...