Generating text

In our discussions of deep learning and natural language processing, we spoke extensively about how these techniques are used for text generation, often to very convincing effect. We are now going to get our hands dirty with a little text generation of our own.

The neural network architecture we will be using is a recurrent neural network, and in particular, an LSTM [9]. LSTM stands for Long Short-Term Memory, and the architecture is distinctive because it can capture both the short-term and the long-term context of words in a sentence. The very popular blog post Understanding LSTM Networks [11] by deep learning researcher Christopher Olah is a great way to further understand LSTMs.
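To make the idea concrete, here is a minimal sketch of character-level text generation with a single LSTM layer. It assumes Keras (via TensorFlow) and a tiny toy corpus purely for illustration; the chapter's own examples may use a different library, a larger dataset, and longer training.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Toy corpus (an assumption for this sketch); in practice you would load a large text file.
text = "the quick brown fox jumps over the lazy dog. " * 50
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
idx_to_char = {i: c for i, c in enumerate(chars)}

# Cut the text into fixed-length sequences; each sequence is trained to predict the next character.
seq_len, step = 40, 3
sequences, next_chars = [], []
for i in range(0, len(text) - seq_len, step):
    sequences.append(text[i:i + seq_len])
    next_chars.append(text[i + seq_len])

# One-hot encode inputs and targets.
X = np.zeros((len(sequences), seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(sequences), len(chars)), dtype=np.float32)
for i, seq in enumerate(sequences):
    for t, c in enumerate(seq):
        X[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[next_chars[i]]] = 1.0

# A single LSTM layer followed by a softmax over the character vocabulary.
model = Sequential([
    LSTM(128, input_shape=(seq_len, len(chars))),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=64, epochs=5)

# Generate text by repeatedly sampling the next character and sliding the window forward.
seed = text[:seq_len]
generated = seed
for _ in range(200):
    x_pred = np.zeros((1, seq_len, len(chars)), dtype=np.float32)
    for t, c in enumerate(seed):
        x_pred[0, t, char_to_idx[c]] = 1.0
    probs = model.predict(x_pred, verbose=0)[0].astype("float64")
    probs /= probs.sum()  # renormalize to guard against floating-point drift
    next_char = idx_to_char[np.random.choice(len(chars), p=probs)]
    generated += next_char
    seed = seed[1:] + next_char
print(generated)

The key design choice is sampling from the softmax distribution rather than always taking the most likely character, which keeps the generated text from collapsing into repetitive loops.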

This is the same architecture used in the popular blog post [10] by ...
