September 2017
RNNs are notoriously difficult to train. Vanilla RNNs, the kind of RNNs we have been talking about so far, suffer from vanishing and exploding gradients, which lead to erratic results during training. As a result, RNNs have difficulty learning long-range dependencies. For time series forecasting, this means that looking too many timesteps back into the past becomes problematic. To address this problem, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, which are special types of RNNs, were introduced. In this chapter, we will use LSTM and GRU to develop time series forecasting models. Before that, let's review how RNNs are trained using Backpropagation Through Time (BPTT), a variant of the backpropagation ...
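To make the vanishing and exploding gradient behaviour concrete, here is a minimal sketch (not from the book; `gradient_factor` is a hypothetical helper). During BPTT, the gradient flowing back k timesteps is repeatedly multiplied by the recurrent weight's Jacobian, so in the scalar case it behaves roughly like the recurrent weight raised to the power k (activation derivatives are ignored here for simplicity):

```python
# Illustrative sketch: why vanilla RNN gradients vanish or explode.
# A scalar stands in for the recurrent weight matrix W_hh; over k
# timesteps of backpropagation the gradient picks up a factor of w**k.

def gradient_factor(w, steps):
    """Product of a scalar recurrent weight over `steps` timesteps of BPTT."""
    factor = 1.0
    for _ in range(steps):
        factor *= w
    return factor

# A weight slightly below 1 makes the gradient vanish over long horizons...
vanish = gradient_factor(0.9, 50)    # roughly 0.005
# ...while a weight slightly above 1 makes it explode.
explode = gradient_factor(1.1, 50)   # roughly 117
```

Either way, the gradient signal from distant timesteps is unusable, which is exactly the long-range-dependency problem that the gating mechanisms in LSTM and GRU are designed to mitigate.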