LSTM

LSTMs are a special form of RNN that excel at learning long-term dependencies. Developed by Hochreiter and Schmidhuber in 1997, an LSTM passes information through several different layers that help it keep what is important and jettison the rest. Unlike a vanilla recurrent network, the LSTM has not one but two states: the standard hidden state that we've been representing as ht, and a state specific to the LSTM cell called the cell state, which we will denote with ct. The LSTM updates and adjusts these states with gating mechanisms. These gates control the flow of information through the cell, and each consists of an activation function and a basic pointwise operation, such as vector multiplication. ...
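
To make the two states and the gating concrete, here is a minimal NumPy sketch of a single LSTM time step. The function name lstm_step and the stacked parameter layout (W, U, b holding all four gates together) are illustrative assumptions for this sketch, not code from the book; they show how the gates combine sigmoid and tanh activations with pointwise operations to update ct and ht.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step (illustrative sketch).

    W, U, b stack the parameters for the four gates
    (input i, forget f, output o, candidate g), so
    W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,).
    """
    hidden = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b           # pre-activations for all gates
    i = sigmoid(z[0*hidden:1*hidden])      # input gate
    f = sigmoid(z[1*hidden:2*hidden])      # forget gate
    o = sigmoid(z[2*hidden:3*hidden])      # output gate
    g = np.tanh(z[3*hidden:4*hidden])      # candidate cell update

    c_t = f * c_prev + i * g               # new cell state (pointwise ops)
    h_t = o * np.tanh(c_t)                 # new hidden state
    return h_t, c_t

# Example usage with random parameters:
input_size, hidden_size = 8, 16
rng = np.random.default_rng(0)
W = rng.normal(size=(4 * hidden_size, input_size))
U = rng.normal(size=(4 * hidden_size, hidden_size))
b = np.zeros(4 * hidden_size)
h, c = np.zeros(hidden_size), np.zeros(hidden_size)
h, c = lstm_step(rng.normal(size=input_size), h, c, W, U, b)

Note how the forget gate f decides what fraction of the old cell state to keep, while the input gate i decides how much of the new candidate g to write in; this separation is what lets the cell state carry information across many time steps.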
