How to do it...

With this, let us go ahead and build the model on the input and output datasets that we prepared in the previous section (step 1 of the many-to-hidden-to-many architecture from the previous section remains the same). The code file is available as Machine_translation.ipynb on GitHub.

  1. Build the model, as follows:
from keras.layers import Input, Embedding, LSTM

# We shall convert each word into a 128-sized vector
embedding_size = 128
    1. Prepare the encoder model:
encoder_inputs = Input(shape=(None,))
en_x = Embedding(num_encoder_tokens+1, embedding_size)(encoder_inputs)
encoder = LSTM(256, return_state=True)
encoder_outputs, state_h, state_c = encoder(en_x)
# We discard `encoder_outputs` and only keep the states.
encoder_states = [state_h, state_c]

Note that we are using a ...
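To see what these states actually contain, here is a minimal, self-contained sketch (not part of the book's notebook): it wraps the encoder in a throwaway Model and inspects the shapes of the returned states. The value of num_encoder_tokens is hypothetical here; in the recipe it comes from the data-preparation step.

from keras.models import Model
from keras.layers import Input, Embedding, LSTM
import numpy as np

num_encoder_tokens = 5000   # hypothetical vocabulary size, for illustration only
embedding_size = 128

encoder_inputs = Input(shape=(None,))
en_x = Embedding(num_encoder_tokens + 1, embedding_size)(encoder_inputs)
encoder_outputs, state_h, state_c = LSTM(256, return_state=True)(en_x)

# A throwaway model that exposes only the two LSTM states
probe = Model(encoder_inputs, [state_h, state_c])
h, c = probe.predict(np.random.randint(0, num_encoder_tokens, (2, 10)))
print(h.shape, c.shape)   # (2, 256) (2, 256): one 256-dim state per input sequence

In a typical Keras sequence-to-sequence setup, these two state vectors are what encoder_states carries forward: they are passed to the decoder's LSTM (via its initial_state argument) so that decoding starts from a summary of the input sentence.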
