October 2018
472 pages
Here, we will define the RNN decoder, which takes the encoded features produced by the encoder. The features are fed into the attention layer, whose output (the context vector) is concatenated with the input embedding vector. The concatenated vector is then passed into the GRU module, whose output flows through two fully connected layers:
class RNN_Decoder(tf.keras.Model):
    def __init__(self, embedding_dim, units, vocab_size):
        super(RNN_Decoder, self).__init__()
        self.units = units
        # Embedding layer for the input tokens
        self.embedding = tf.keras.layers.Embedding(vocab_size, embedding_dim)
        # gru() is the helper defined earlier, which returns the GRU layer
        self.gru = gru(self.units)
        # Two fully connected layers; fc2 projects to vocabulary logits
        self.fc1 = tf.keras.layers.Dense(self.units)
        self.fc2 = tf.keras.layers.Dense(vocab_size)
        self.attention = BahdanauAttention(self.units)

    def call(self, x, features, hidden):
        # defining ...
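To make the data flow concrete, the following is a minimal, self-contained sketch of the decoder's forward pass using standalone Keras layers and random tensors. The batch size, unit count, and vocabulary size here are hypothetical, chosen only to illustrate the shapes; the context vector stands in for the attention layer's output:

```python
import tensorflow as tf

# Hypothetical sizes, for illustration only
batch_size, units, embedding_dim, vocab_size = 4, 512, 256, 5000

embedding = tf.keras.layers.Embedding(vocab_size, embedding_dim)
gru = tf.keras.layers.GRU(units, return_sequences=True, return_state=True)
fc1 = tf.keras.layers.Dense(units)
fc2 = tf.keras.layers.Dense(vocab_size)

# Stand-in for the attention layer's context vector: (batch_size, embedding_dim)
context_vector = tf.random.normal((batch_size, embedding_dim))

# Embed the current input token: (batch_size, 1) -> (batch_size, 1, embedding_dim)
x = embedding(tf.random.uniform((batch_size, 1), maxval=vocab_size, dtype=tf.int32))

# Concatenate the context vector with the input embedding along the feature axis
x = tf.concat([tf.expand_dims(context_vector, 1), x], axis=-1)

# The GRU returns the sequence output and the final hidden state
output, state = gru(x)

# Two fully connected layers map the GRU output to vocabulary logits
x = fc1(output)
x = tf.reshape(x, (-1, x.shape[2]))
logits = fc2(x)
```

Running this, `logits` has shape `(batch_size, vocab_size)` and `state` has shape `(batch_size, units)`; the hidden state is fed back into the decoder at the next time step.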