Bidirectional RNN

In a bidirectional RNN, we have two separate layers of hidden units. Both layers connect the input layer to the output layer. In one layer, the hidden states are shared from left to right, and in the other layer, they are shared from right to left.

But what does this mean? To put it simply, one hidden layer moves forward through time from the start of the sequence, while the other hidden layer moves backward through time from the end of the sequence.
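Concretely, one common formulation (the notation here is illustrative; the book's diagram may label things differently) computes the forward state from the current input and the previous forward state, and the backward state from the current input and the next backward state:

$$\overrightarrow{h}_t = \tanh\left(W_{\overrightarrow{x}}\, x_t + W_{\overrightarrow{h}}\, \overrightarrow{h}_{t-1} + b_{\overrightarrow{h}}\right)$$

$$\overleftarrow{h}_t = \tanh\left(W_{\overleftarrow{x}}\, x_t + W_{\overleftarrow{h}}\, \overleftarrow{h}_{t+1} + b_{\overleftarrow{h}}\right)$$

The output at each time step then combines both states, for example by concatenating them: $y_t = W_y\,[\overrightarrow{h}_t ; \overleftarrow{h}_t] + b_y$.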

As shown in the following diagram, we have two hidden layers: a forward hidden layer and a backward hidden layer, which are described as follows:

  • In the forward hidden layer, hidden state values are shared from past time steps; that is, $\overrightarrow{h}_1$ is shared to $\overrightarrow{h}_2$, $\overrightarrow{h}_2$ is shared to $\overrightarrow{h}_3$, and so on.

  • In the backward hidden layer, hidden state values are shared from future time steps; that is, $\overleftarrow{h}_3$ is shared to $\overleftarrow{h}_2$, $\overleftarrow{h}_2$ is shared to $\overleftarrow{h}_1$, and so on. (A runnable sketch of both passes follows this list.)

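To make the two passes concrete, here is a minimal NumPy sketch of a bidirectional RNN forward pass. All weight names and dimensions are hypothetical, chosen just for illustration; it follows the formulation above rather than any particular library's API:

```python
import numpy as np

def birnn_forward(X, Wf, Uf, bf, Wb, Ub, bb, V, by):
    """X: input sequence of shape (T, input_dim).
    Returns outputs of shape (T, output_dim)."""
    T = X.shape[0]
    hidden_dim = Uf.shape[0]

    # Forward hidden layer: moves left to right,
    # so h_fwd[t] depends on h_fwd[t - 1].
    h_fwd = np.zeros((T, hidden_dim))
    h_prev = np.zeros(hidden_dim)
    for t in range(T):
        h_prev = np.tanh(X[t] @ Wf + h_prev @ Uf + bf)
        h_fwd[t] = h_prev

    # Backward hidden layer: moves right to left,
    # so h_bwd[t] depends on h_bwd[t + 1].
    h_bwd = np.zeros((T, hidden_dim))
    h_next = np.zeros(hidden_dim)
    for t in reversed(range(T)):
        h_next = np.tanh(X[t] @ Wb + h_next @ Ub + bb)
        h_bwd[t] = h_next

    # Output at each step combines both directions (concatenation here).
    H = np.concatenate([h_fwd, h_bwd], axis=1)  # (T, 2 * hidden_dim)
    return H @ V + by

# Tiny usage example with random weights (hypothetical dimensions).
rng = np.random.default_rng(0)
T, input_dim, hidden_dim, output_dim = 5, 3, 4, 2
X = rng.normal(size=(T, input_dim))
Wf, Wb = rng.normal(size=(2, input_dim, hidden_dim))
Uf, Ub = rng.normal(size=(2, hidden_dim, hidden_dim))
bf, bb = np.zeros((2, hidden_dim))
V = rng.normal(size=(2 * hidden_dim, output_dim))
by = np.zeros(output_dim)
print(birnn_forward(X, Wf, Uf, bf, Wb, Ub, bb, V, by).shape)  # (5, 2)
```

Note that the output at time step $t$ depends on the input sequence in both directions, which is why bidirectional RNNs are useful when the full sequence is available before prediction.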