For tasks that involve sequential inputs, Recurrent Neural Networks (RNNs) are often the natural choice. An RNN processes the input one element at a time while maintaining a "state vector" in its hidden units; this state implicitly encodes information about all the past elements of the sequence.
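The mechanism can be sketched as a single recurrence: the new state is a function of the current input and the previous state. Below is a minimal vanilla-RNN step in NumPy; the parameter names (`W_xh`, `W_hh`, `b_h`) and the tanh activation are illustrative assumptions, not prescribed by the text.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3

# Illustrative parameters: input-to-hidden, hidden-to-hidden weights, and bias.
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

def rnn_step(x, h):
    """One time step: the new state mixes the current input with the old state."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Process a sequence one element at a time, carrying the state vector along.
h = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):
    h = rnn_step(x, h)  # h now implicitly summarizes all inputs seen so far
```

Because `h` is fed back at every step, information from early elements can, in principle, influence the state arbitrarily far into the sequence.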
Conventional RNNs, however, have difficulty storing information for long periods. To remember inputs over long time spans, the network can be augmented with an explicit memory. This is the approach taken by Long Short-Term Memory (LSTM) networks, which use special hidden units whose natural behavior is to remember inputs for a long time. LSTM networks have proved more effective than conventional RNNs.
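The "explicit memory" takes the form of a cell state that is carried forward mostly unchanged, with multiplicative gates deciding what to write, keep, and read. A minimal single-step sketch of a standard LSTM cell follows; the weight layout (one matrix per gate over the concatenated input and state) is one common convention, chosen here for clarity.

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, hidden_size = 4, 3

def init(shape):
    return rng.standard_normal(shape) * 0.1

# One weight matrix and bias per gate: input (i), forget (f), output (o),
# and the candidate values (g), each acting on [x; h] concatenated.
W = {k: init((hidden_size, input_size + hidden_size)) for k in "ifog"}
b = {k: np.zeros(hidden_size) for k in "ifog"}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    """One LSTM step: gates control writing to, keeping, and reading the cell c."""
    z = np.concatenate([x, h])
    i = sigmoid(W["i"] @ z + b["i"])  # input gate: how much new info to admit
    f = sigmoid(W["f"] @ z + b["f"])  # forget gate: how much old memory to keep
    o = sigmoid(W["o"] @ z + b["o"])  # output gate: how much of the cell to expose
    g = np.tanh(W["g"] @ z + b["g"])  # candidate values to write
    c = f * c + i * g                 # explicit memory cell update
    h = o * np.tanh(c)                # new hidden state read out from the cell
    return h, c

h = c = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):
    h, c = lstm_step(x, h, c)
```

The additive update `c = f * c + i * g` is the key design choice: when the forget gate is near 1, the cell carries its contents forward largely intact, which is what lets the network remember inputs over long spans.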
In this section, we will ...