Backpropagation through time

Backpropagation through time is the typical algorithm we use to train recurrent networks (Backpropagation Through Time: What It Does and How to Do It, http://axon.cs.byu.edu/~martinez/classes/678/Papers/Werbos_BPTT.pdf). As the name suggests, it's based on the backpropagation algorithm we discussed in Chapter 1, The Nuts and Bolts of Neural Networks.

The main difference between regular backpropagation and backpropagation through time is that the recurrent network is unfolded through time for a certain number of time steps (as illustrated in the preceding diagram). Once the unfolding is complete, we end up with a model that is quite similar to a regular multi-layer feedforward network, that is, one hidden layer ...
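
To make the unfolding concrete, the following is a minimal NumPy sketch of a single BPTT training step for a vanilla recurrent network. The weight names (U for input-to-hidden, W for hidden-to-hidden, V for hidden-to-output), the tanh activation, and the mean squared error loss are illustrative assumptions rather than code from this chapter.

    import numpy as np

    def bptt_step(x_seq, y_seq, U, W, V, learning_rate=0.01):
        """One BPTT training step of a vanilla RNN on a single sequence.

        x_seq, y_seq -- lists of input and target vectors, one per time step
        U, W, V      -- input-to-hidden, hidden-to-hidden, hidden-to-output weights
        """
        T = len(x_seq)
        hidden_size = W.shape[0]

        # Forward pass: unfold the network for T time steps and store every
        # hidden state, because the backward pass needs all of them.
        h = {-1: np.zeros(hidden_size)}
        y_hat = {}
        for t in range(T):
            h[t] = np.tanh(U @ x_seq[t] + W @ h[t - 1])
            y_hat[t] = V @ h[t]                      # linear output layer

        # Backward pass: walk the unfolded graph in reverse and accumulate the
        # gradients of the shared weights over all time steps.
        dU, dW, dV = np.zeros_like(U), np.zeros_like(W), np.zeros_like(V)
        dh_next = np.zeros(hidden_size)              # gradient arriving from step t+1
        for t in reversed(range(T)):
            dy = y_hat[t] - y_seq[t]                 # gradient of the MSE loss at step t
            dV += np.outer(dy, h[t])
            dh = V.T @ dy + dh_next                  # error from the output and from step t+1
            dh_raw = (1.0 - h[t] ** 2) * dh          # backpropagate through tanh
            dU += np.outer(dh_raw, x_seq[t])
            dW += np.outer(dh_raw, h[t - 1])
            dh_next = W.T @ dh_raw                   # hand the gradient to step t-1

        # Gradient descent update of the shared weights
        U -= learning_rate * dU
        W -= learning_rate * dW
        V -= learning_rate * dV
        return U, W, V

In practice, deep learning frameworks build the unfolded graph and compute these gradients automatically; the explicit loops are spelled out here only to show how the error flows backward through every time step of the unfolded network.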
