Backpropagation, or backprop, is used to efficiently calculate the gradient of the cost function, C. In simple terms, the goal of backprop is to compute the rate of change of the cost, C, with respect to the weights, ∂C/∂w, and the biases, ∂C/∂b.
In order to clarify the intuition behind backprop, let's consider the following deep neural network:
Imagine that we have made a small change, Δw, in the value of some weight, w, in the network. ...
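The perturbation idea above can be sketched numerically. The snippet below (a minimal, hypothetical example, not the backprop algorithm itself) uses a single sigmoid neuron with a quadratic cost, nudges one weight by a small Δw, and observes the resulting change in C, so that ΔC/Δw approximates the partial derivative ∂C/∂w; the input x, target y, and starting weight are arbitrary choices for illustration.

```python
import numpy as np

def cost(w, x=1.5, y=0.0):
    """Quadratic cost of a tiny one-neuron network with weight w."""
    a = 1.0 / (1.0 + np.exp(-w * x))  # sigmoid activation
    return 0.5 * (a - y) ** 2

w = 0.8
delta_w = 1e-5                         # the small change in the weight
delta_C = cost(w + delta_w) - cost(w)  # resulting change in the cost
grad_estimate = delta_C / delta_w      # finite-difference estimate of dC/dw
print(grad_estimate)
```

Backprop computes the same quantity exactly (and for every weight at once) via the chain rule, which is what makes it far more efficient than perturbing each weight in turn.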