February 2018
The LSTM is a variant of the RNN designed to capture long-term dependencies in text. LSTMs were introduced by Hochreiter & Schmidhuber (1997) (link: http://www.bioinf.jku.at/publications/older/2604.pdf), and many researchers have since built on them, producing interesting results in many domains.
This kind of architecture is able to handle the problem of long-term dependencies because of its inner design.
Like a vanilla RNN, an LSTM has a module that repeats over time, but the inner architecture of this repeated module is different: it includes additional gate layers for forgetting and updating information:
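To make the forget/update idea concrete, here is a minimal NumPy sketch of a single LSTM time step. The function name, parameter layout, and gate ordering are illustrative assumptions, not the book's implementation: all four gate layers are packed into one weight matrix `W` for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch, not the book's code).

    x      : input vector at this time step, shape (input_dim,)
    h_prev : previous hidden state, shape (hidden,)
    c_prev : previous cell state, shape (hidden,)
    W      : packed gate weights, shape (hidden + input_dim, 4 * hidden)
    b      : packed gate biases, shape (4 * hidden,)
    Assumed gate layout in W/b: [forget | input | candidate | output].
    """
    hidden = h_prev.shape[0]
    # One affine transform produces pre-activations for all four gates.
    z = np.concatenate([h_prev, x]) @ W + b
    f = sigmoid(z[:hidden])               # forget gate: what to drop from c_prev
    i = sigmoid(z[hidden:2 * hidden])     # input gate: what new info to store
    g = np.tanh(z[2 * hidden:3 * hidden]) # candidate cell values
    o = sigmoid(z[3 * hidden:])           # output gate: what to expose as h
    c = f * c_prev + i * g                # cell state: forget old, add new
    h = o * np.tanh(c)                    # new hidden state
    return h, c
```

The forget gate `f` and input gate `i` are the "layers for forgetting and updating information" mentioned above: they let the cell state `c` carry information across many time steps largely unchanged, which is what mitigates the long-term dependency problem.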