Well, our model did not work especially well with simple feed-forward networks. In this section, we will try a different architecture: bidirectional LSTM networks.
Recall that LSTM networks preserve parts of the previous information via the hidden state. However, this information only concerns the past.
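To make this concrete, here is a minimal sketch of a single LSTM step and of the bidirectional idea, written in pure Python with a scalar (hidden size 1) cell. The gate weights below are toy, hypothetical values, not trained parameters; in practice you would use a library layer such as Keras's `Bidirectional(LSTM(...))` or PyTorch's `nn.LSTM(bidirectional=True)` rather than hand-rolling the cell.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # One LSTM step with scalar states, for illustration only.
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate
    c = f * c_prev + i * g       # cell state mixes kept past info with new info
    h = o * math.tanh(c)         # hidden state passed to the next time step
    return h, c

def run_lstm(xs, w):
    # Run the cell over a sequence, collecting the hidden state at each step.
    h, c, hs = 0.0, 0.0, []
    for x in xs:
        h, c = lstm_step(x, h, c, w)
        hs.append(h)
    return hs

# Toy, hypothetical weights (all 0.5) just to make the sketch runnable.
weights = {k: 0.5 for k in
           ["wf", "uf", "bf", "wi", "ui", "bi", "wo", "uo", "bo", "wg", "ug", "bg"]}

xs = [1.0, -0.5, 0.25]
forward = run_lstm(xs, weights)                       # reads past -> future
backward = list(reversed(run_lstm(xs[::-1], weights)))  # reads future -> past
combined = list(zip(forward, backward))               # both contexts per step
print(combined)
```

The forward pass carries context from earlier time steps; the backward pass carries context from later ones. Pairing the two hidden states at each position is exactly what a bidirectional LSTM layer does before handing the result to the next layer.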
Bidirectional LSTMs run both ways: from past to future and from future to past! The LSTM that runs backward preserves information from the future. By combining the two hidden states, you keep the context of both the past and the future. Clearly this would not make sense for stock price prediction! Their use was initially justified in the domain of speech recognition because, as you might know from experience, the context of the ...