December 2018
The unrolled computational graph shown in the preceding diagram highlights that learning spans every time step of a given input sequence. The backpropagation algorithm, which updates the weight parameters based on the gradient of the loss function with respect to those parameters, consists of a forward pass from left to right along the unrolled computational graph, followed by a backward pass in the opposite direction.
Just like the backpropagation techniques discussed in Chapter 16, Deep Learning, the algorithm evaluates a loss function to obtain the gradients and update the weight parameters. In the RNN context, backpropagation runs from right to left in the computational graph, ...
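This forward-then-backward traversal of the unrolled graph can be sketched with plain NumPy. The example below is a minimal, hypothetical illustration (the dimensions, the tanh recurrence, and the squared-error loss are assumptions for the sketch, not the book's exact setup): the forward pass walks left to right accumulating hidden states, and the backward pass walks right to left, passing each step's gradient back through the recurrent weight matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): 3 time steps, 2 inputs, 4 hidden units
T, n_in, n_h = 3, 2, 4
xs = rng.normal(size=(T, n_in))
targets = rng.normal(size=(T, n_h))

W_xh = rng.normal(scale=0.1, size=(n_h, n_in))  # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(n_h, n_h))   # hidden-to-hidden weights

def forward(W_xh, W_hh):
    """Forward pass, left to right through the unrolled graph."""
    h = np.zeros(n_h)
    hs, loss = [], 0.0
    for t in range(T):
        h = np.tanh(W_xh @ xs[t] + W_hh @ h)
        hs.append(h)
        loss += 0.5 * np.sum((h - targets[t]) ** 2)  # squared-error loss
    return loss, hs

def backward(W_xh, W_hh, hs):
    """Backward pass, right to left (backpropagation through time)."""
    dW_xh = np.zeros_like(W_xh)
    dW_hh = np.zeros_like(W_hh)
    dh_next = np.zeros(n_h)  # gradient arriving from step t+1
    for t in reversed(range(T)):
        dh = (hs[t] - targets[t]) + dh_next   # local loss grad + future grad
        dpre = dh * (1.0 - hs[t] ** 2)        # tanh'(a) = 1 - tanh(a)^2
        h_prev = hs[t - 1] if t > 0 else np.zeros(n_h)
        dW_xh += np.outer(dpre, xs[t])
        dW_hh += np.outer(dpre, h_prev)
        dh_next = W_hh.T @ dpre               # pass gradient to step t-1
    return dW_xh, dW_hh

loss, hs = forward(W_xh, W_hh)
dW_xh, dW_hh = backward(W_xh, W_hh, hs)

# Sanity check: compare one analytic gradient entry with a finite difference
eps = 1e-6
W_pert = W_hh.copy()
W_pert[0, 0] += eps
num_grad = (forward(W_xh, W_pert)[0] - loss) / eps
print(abs(num_grad - dW_hh[0, 0]) < 1e-4)
```

Note how `dh_next` is the only quantity carried between time steps during the backward pass; it is exactly the gradient flowing backward along the recurrent edges of the unrolled graph, which is why repeated multiplication by `W_hh.T` can shrink or amplify it over long sequences.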