
Practical Time Series Analysis by Dr. PKS Prakash, Dr. Avishek Pal


Training recurrent neural networks

RNNs are notoriously difficult to train. Vanilla RNNs, the kind of RNN we have discussed so far, suffer from vanishing and exploding gradients, which make training erratic. As a result, vanilla RNNs struggle to learn long-range dependencies, and for time series forecasting this means that looking too many timesteps back into the past is problematic. To address this problem, two special types of RNN, the Long Short Term Memory (LSTM) network and the Gated Recurrent Unit (GRU), have been introduced. In this chapter, we will use LSTM and GRU networks to develop time series forecasting models. Before that, let's review how RNNs are trained using Backpropagation Through Time (BPTT), a variant of the backpropagation ...
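The vanishing/exploding behaviour can be seen numerically. During BPTT, backpropagating the loss k timesteps multiplies k Jacobians of the form diag(tanh'(a_t)) · W_hh; if this product shrinks toward zero the gradient vanishes, and if it grows without bound the gradient explodes. The following is a minimal sketch (not code from the book; `bptt_jacobian_norm` is a hypothetical helper) illustrating this with NumPy:

```python
import numpy as np

def bptt_jacobian_norm(W_hh, steps=50, seed=42):
    """Norm of the product of `steps` BPTT Jacobians for a tanh RNN
    driven with zero input from a small random initial state."""
    rng = np.random.default_rng(seed)
    h = 0.01 * rng.standard_normal(W_hh.shape[0])
    J = np.eye(W_hh.shape[0])
    for _ in range(steps):
        h = np.tanh(W_hh @ h)                 # forward step (zero input)
        J = np.diag(1.0 - h**2) @ W_hh @ J    # chain rule, one timestep back
    return np.linalg.norm(J)

# Recurrent weights with spectral norm < 1: the gradient signal vanishes.
print(bptt_jacobian_norm(0.5 * np.eye(8)))    # vanishingly small

# In the linear simplification (identity activation) the Jacobian product is
# just W_hh raised to the number of steps, so spectral norm > 1 explodes.
print(np.linalg.norm(np.linalg.matrix_power(1.1 * np.eye(8), 100)))
```

The gating mechanisms in LSTM and GRU cells are designed precisely so that this Jacobian product can stay close to 1 across many timesteps, which is why they cope better with long-range dependencies.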
