How to do it...

With this in place, let's look at how to build the encoder-decoder architecture along with the attention mechanism. The code file is available as Machine_translation.ipynb on GitHub.
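The notebook's earlier preprocessing steps define the shapes and vocabulary sizes that the code below relies on. As a minimal sketch for running the snippets standalone, the imports and the illustrative values here are assumptions; the actual values come from tokenizing the English and French corpora:

from keras.models import Model
from keras.layers import Input, Embedding, Dropout, LSTM, Dense, Activation, TimeDistributed, dot, concatenate

# Illustrative values only; in the notebook these are derived during preprocessing
eng_max_length = 20         # longest English sentence, in tokens
fr_max_length = 20          # longest French sentence, in tokens
num_encoder_tokens = 10000  # English vocabulary size
num_decoder_tokens = 10000  # French vocabulary size
embedding_size = 128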

  1. Build the encoder, as shown in the following code:

# Encoder: embed the padded English input and pass it through an LSTM
# that returns the hidden state at every timestep (needed later for attention)
encoder_inputs = Input(shape=(eng_max_length,))
en_x = Embedding(num_encoder_tokens+1, embedding_size)(encoder_inputs)
en_x = Dropout(0.1)(en_x)
encoder = LSTM(256, return_sequences=True, unroll=True)(en_x)
# Hidden state at the last timestep, used to initialize the decoder
encoder_last = encoder[:,-1,:]
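At this point, encoder has shape (batch_size, eng_max_length, 256), one hidden state per input timestep, while encoder_last has shape (batch_size, 256), the state at the final timestep only.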
  2. Build the decoder, as follows:

# Decoder: embed the padded French input and pass it through its own LSTM,
# seeded with the encoder's final hidden state
decoder_inputs = Input(shape=(fr_max_length,))
dex = Embedding(num_decoder_tokens+1, embedding_size)
decoder = dex(decoder_inputs)
decoder = Dropout(0.1)(decoder)
# The excerpt truncates the initial_state argument; reusing encoder_last for
# both the hidden and cell state is an assumption consistent with step 1
decoder = LSTM(256, return_sequences=True, unroll=True)(decoder, initial_state=[encoder_last, encoder_last])
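The excerpt ends before the attention step is shown, so the following is a minimal sketch of a common dot-product (Luong-style) attention wiring between the encoder and decoder outputs built above; the variable names (attention, context, decoder_combined) and the closing Dense projection are assumptions, not the book's verbatim code:

# Score every decoder timestep against every encoder timestep
attention = dot([decoder, encoder], axes=[2, 2])      # (batch, fr_len, eng_len)
attention = Activation('softmax')(attention)          # normalize scores into attention weights
# Weighted sum of encoder hidden states for each decoder timestep
context = dot([attention, encoder], axes=[2, 1])      # (batch, fr_len, 256)
decoder_combined = concatenate([context, decoder])    # attach context to the decoder output
# Project every decoder timestep onto the French vocabulary
output = TimeDistributed(Dense(num_decoder_tokens+1, activation='softmax'))(decoder_combined)
model = Model(inputs=[encoder_inputs, decoder_inputs], outputs=[output])
model.compile(optimizer='adam', loss='categorical_crossentropy')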
