LSTM network

Now, we'll train a two-layer LSTM network with 512 cells in each layer. The full code is available at . Since we'll use truncated BPTT, we need to store the network state between batches:

  1. First, we'll define placeholders for our input and targets. The placeholders are the links between the model and the training data; we feed the network a single batch by assigning that batch's values to the placeholders. The first dimension of both the input and the targets is the batch size, while the second runs along the text sequence. Both placeholders take batches of sequences in which the characters are represented by their indices:
# Shape: (batch size, sequence length); each entry is a character index
self.inputs = tf.placeholder(tf.int32, [batch_size, sequence_length])
self.targets = tf.placeholder(tf.int32, [batch_size, sequence_length])
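To make the shape of these batches concrete, here is a minimal NumPy sketch of how a text can be turned into index batches of inputs and targets. The corpus, vocabulary mapping, and batch dimensions are illustrative assumptions, not the book's actual data; the key point is that the targets are the inputs shifted by one character, so the network learns to predict the next character at every step:

```python
import numpy as np

# A toy corpus; in the book's setting this would be the training text.
text = "hello world, hello lstm"

# Assumption: a simple vocabulary mapping each unique character to an index.
char_to_idx = {ch: i for i, ch in enumerate(sorted(set(text)))}
encoded = np.array([char_to_idx[ch] for ch in text], dtype=np.int32)

batch_size, sequence_length = 2, 5
n = batch_size * sequence_length

# Inputs and targets are the same index sequence, offset by one character.
inputs = encoded[:n].reshape(batch_size, sequence_length)
targets = encoded[1:n + 1].reshape(batch_size, sequence_length)
```

Arrays shaped like `inputs` and `targets` are exactly what gets fed into the two placeholders, one batch at a time.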
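The reason truncated BPTT requires storing state between batches can be sketched with a tiny vanilla RNN cell in NumPy (a stand-in for the LSTM; all sizes and parameters here are illustrative assumptions). Instead of resetting the hidden state to zero for every batch, the final state of one segment becomes the initial state of the next, so the network can still carry context across the truncation boundary:

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, vocab_size = 8, 5

# Hypothetical parameters of a single vanilla RNN cell.
Wxh = rng.normal(0, 0.1, (vocab_size, hidden_size))
Whh = rng.normal(0, 0.1, (hidden_size, hidden_size))

def run_segment(idx_seq, h):
    """Run one truncated-BPTT segment and return the final hidden state."""
    for t in idx_seq:
        x = np.eye(vocab_size)[t]        # one-hot encode the character index
        h = np.tanh(x @ Wxh + h @ Whh)   # vanilla RNN cell update
    return h

# The state is initialized once, then carried from segment to segment,
# rather than being reset to zero at every batch.
h = np.zeros(hidden_size)
for segment in ([0, 1, 2, 3], [4, 0, 1, 2]):
    h = run_segment(segment, h)
```

In the TensorFlow model this carried-over `h` corresponds to saving the LSTM's state after each `Session.run` call and feeding it back in as the initial state of the next batch.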
