8 Long-and-Short-Term Memory (LSTM) Networks: Architectures and Applications in Stock Price Prediction
Jaydip Sen and Sidra Mehtab
Praxis Business School, Kolkata, India
8.1 Introduction
Recurrent neural networks (RNNs) are a special type of neural network capable of processing and modeling sequential data such as text, speech, time series, etc. In an RNN, the output of the network at a given time slot depends on the current input to the network as well as the previous state of the network [1]. Unfortunately, these networks are poor at capturing long-term dependencies in the data due to a problem known as vanishing or exploding gradients [2]. LSTM networks, a variant of RNNs, are able to overcome the problem of vanishing or exploding gradients, and hence such networks are quite effective and efficient in analyzing time series and other sequential data. LSTM networks consist of memory cells whose contents are regulated by gates. The cells store past information about their states, and the gates control the flow of information over time. There are four types of gates in LSTM networks. The forget gates decide what information from the past to discard and what to retain at a given state. The input gates enable the network to control the input at the current state. The outputs of the forget gates and the input gates are aggregated into a cell state vector. In other words, the cell state aggregates the old state information from the forget gate with the current ...
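The gating mechanism described above can be sketched as a single LSTM time step in NumPy. This is a minimal illustration with toy dimensions, not the chapter's implementation; the function and parameter names (`lstm_step`, `W_f`, `b_f`, etc.) are our own, and the output-gate step follows the standard LSTM formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM time step over input x_t given the previous hidden
    state h_prev and cell state c_prev (illustrative sketch)."""
    W_f, W_i, W_o, W_c, b_f, b_i, b_o, b_c = params
    z = np.concatenate([h_prev, x_t])       # previous state + current input
    f_t = sigmoid(W_f @ z + b_f)            # forget gate: what old state to keep
    i_t = sigmoid(W_i @ z + b_i)            # input gate: what new input to admit
    c_tilde = np.tanh(W_c @ z + b_c)        # candidate cell contents
    c_t = f_t * c_prev + i_t * c_tilde      # cell state aggregates old and new
    o_t = sigmoid(W_o @ z + b_o)            # output gate: what to expose
    h_t = o_t * np.tanh(c_t)                # hidden state (network output)
    return h_t, c_t

# Toy demo: run 5 steps with small random weights.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
params = tuple(0.1 * rng.standard_normal((n_hid, n_hid + n_in)) for _ in range(4)) \
       + tuple(np.zeros(n_hid) for _ in range(4))
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):
    x_t = rng.standard_normal(n_in)
    h, c = lstm_step(x_t, h, c, params)
print(h.shape)  # (4,)
```

Because the forget gate multiplies the old cell state rather than repeatedly squashing it through an activation, gradients can flow across many time steps, which is what lets the LSTM retain long-term dependencies.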