March 2018
Intermediate to advanced
484 pages
10h 31m
English
LSTM networks are equipped with special hidden units, called memory cells, whose purpose is to remember inputs over long time spans. At each time step, a cell takes the network's previous state and the current input, combines them with the current contents of memory, and uses a gating mechanism, implemented by other units, to decide what to keep in memory and what to delete. This design has proved to be a very effective way of learning long-term dependencies.
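The gating mechanism described above can be sketched as a single LSTM time step. This is a minimal NumPy illustration, not any library's implementation; the weight layout (the four gates stacked in one matrix `W`) and all names are assumptions made for clarity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch, not a library API).

    W stacks the four gate weight matrices, shape (4*H, D + H);
    b has shape (4*H,). H is the hidden size, D the input size.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:H])        # forget gate: what to delete from memory
    i = sigmoid(z[H:2 * H])    # input gate: what to write to memory
    g = np.tanh(z[2 * H:3 * H])  # candidate memory contents
    o = sigmoid(z[3 * H:4 * H])  # output gate: what to expose
    c = f * c_prev + i * g     # updated memory cell: keep some, add some
    h = o * np.tanh(c)         # new hidden state read out from memory
    return h, c

# Usage: run a short random sequence through the cell
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(scale=0.1, size=(4 * H, D + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):
    h, c = lstm_step(x, h, c, W, b)
```

The key point is the cell-state update `c = f * c_prev + i * g`: because the forget gate `f` can stay close to 1, information can flow through many time steps largely unchanged, which is what lets the network learn long-term dependencies.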
In this chapter, we discussed RNNs and saw how to make predictions on data with strong temporal dependencies. We also saw how to develop several real-life predictive models that make predictive analytics easier using RNNs and the different architectural ...