LSTM

LSTMs are a special form of RNN that excel at learning long-term dependencies. Developed by Hochreiter and Schmidhuber in 1997, LSTMs pass information through several internal layers that help them keep what is important and jettison the rest. Unlike vanilla recurrent networks, the LSTM has not one but two states: the standard hidden state that we've been representing as h_t, as well as a state specific to the LSTM cell called the cell state, which we will denote with c_t. The LSTM updates or adjusts these states with gating mechanisms. These gates control the flow of information through the cell, and each consists of an activation function and a basic pointwise operation, such as elementwise vector multiplication. ...
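To make the interaction between the two states and the gates concrete, here is a minimal NumPy sketch of a single LSTM time step. The function name lstm_cell_step and the weight names (W_f, W_i, W_o, W_c) are illustrative choices, not from the book; the structure follows the standard LSTM equations, with sigmoid-activated gates combined with the states through pointwise operations.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_cell_step(x_t, h_prev, c_prev, params):
        """One LSTM time step: gates decide how the cell state c_t
        and hidden state h_t are updated from the previous states."""
        W_f, W_i, W_o, W_c, b_f, b_i, b_o, b_c = params
        z = np.concatenate([h_prev, x_t])     # previous hidden state joined with the new input

        f_t = sigmoid(W_f @ z + b_f)          # forget gate: what to discard from c_prev
        i_t = sigmoid(W_i @ z + b_i)          # input gate: what new information to store
        c_hat = np.tanh(W_c @ z + b_c)        # candidate values for the cell state
        c_t = f_t * c_prev + i_t * c_hat      # new cell state (pointwise multiply and add)
        o_t = sigmoid(W_o @ z + b_o)          # output gate: what part of c_t to expose
        h_t = o_t * np.tanh(c_t)              # new hidden state

        return h_t, c_t

    # Example usage with random weights: input size 3, hidden size 4
    n_in, n_h = 3, 4
    rng = np.random.default_rng(0)
    params = [rng.standard_normal((n_h, n_h + n_in)) for _ in range(4)] + \
             [np.zeros(n_h) for _ in range(4)]
    h, c = np.zeros(n_h), np.zeros(n_h)
    h, c = lstm_cell_step(rng.standard_normal(n_in), h, c, params)
    print(h.shape, c.shape)                   # (4,) (4,)

Running the step repeatedly over a sequence, feeding h and c back in at each time, is what lets the cell carry information across many time steps.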
