August 2017
The LSTM has a similar overall structure to the RNN; however, the basic cell is very different: a traditional RNN cell uses a single multi-layer perceptron (MLP), whereas a single LSTM cell contains four layers interacting with each other. These four layers are:
The forget gate in an LSTM decides which information to throw away from the cell state. It depends on the previous hidden state output ht-1 and on Xt, which represents the input at time t.
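The forget gate described above can be sketched in NumPy. This is a minimal illustration, not the book's code: the weight matrix `W_f`, the bias `b_f`, and the dimensions are hypothetical, and the gate is computed as a sigmoid over the concatenated [ht-1, Xt] vector, which is the standard formulation.

```python
import numpy as np

def sigmoid(z):
    # Squashes values into (0, 1): near 0 means "forget", near 1 means "keep"
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative dimensions (assumptions, not from the book)
hidden_size, input_size = 4, 3
rng = np.random.default_rng(0)
W_f = rng.standard_normal((hidden_size, hidden_size + input_size))  # forget-gate weights
b_f = np.zeros(hidden_size)                                         # forget-gate bias

h_prev = rng.standard_normal(hidden_size)   # h_{t-1}, previous hidden state
x_t = rng.standard_normal(input_size)       # X_t, input at time t

# f_t = sigmoid(W_f . [h_{t-1}, X_t] + b_f)
f_t = sigmoid(W_f @ np.concatenate([h_prev, x_t]) + b_f)
print(f_t)
```

Each entry of `f_t` lies strictly between 0 and 1 and is later multiplied element-wise against the previous cell state, scaling down the components to be discarded.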

In the earlier figure, Ct represents the cell state at time t, the input data is represented by Xt, and the previous hidden state is represented as ht-1. The earlier layer ...
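Putting the four interacting layers together, one full LSTM step can be sketched as follows. This is an assumed reference implementation, not the book's code: the stacking order of the four layers (forget, input, candidate, output) inside `W` and `b`, and all dimensions, are choices made for this sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step.

    W stacks the weights of the four layers row-wise in the assumed
    order (forget, input, candidate, output); b stacks their biases.
    """
    H = h_prev.shape[0]
    # Compute all four layers at once over the concatenated [h_{t-1}, X_t]
    z = W @ np.concatenate([h_prev, x_t]) + b
    f = sigmoid(z[0:H])        # forget gate: what to drop from C_{t-1}
    i = sigmoid(z[H:2*H])      # input gate: what new information to admit
    g = np.tanh(z[2*H:3*H])    # candidate values for the cell state
    o = sigmoid(z[3*H:4*H])    # output gate: what part of C_t to expose
    c_t = f * c_prev + i * g   # new cell state C_t
    h_t = o * np.tanh(c_t)     # new hidden state h_t
    return h_t, c_t

# Illustrative dimensions (assumptions)
H, D = 4, 3
rng = np.random.default_rng(1)
W = rng.standard_normal((4 * H, H + D))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)     # initial hidden and cell states
x = rng.standard_normal(D)          # X_t
h, c = lstm_step(x, h, c, W, b)
print(h.shape, c.shape)
```

Note how the cell state `c_t` is updated only by element-wise scaling and addition; this additive path is what lets gradients flow over many time steps more easily than in a plain RNN.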