Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam

Generating text using RNNs

In previous chapters, we used Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks for text classification. Beyond such predictive tasks, RNNs can also be used to build generative models. Because RNNs can learn long-term dependencies from an input text, they can generate completely new sequences. Such a generative model can be either character-based or word-based. In the next section, we will look at a simple word-based text generation model.
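To make the idea concrete, here is a minimal sketch of a word-based generative model in Keras. It is not the book's exact code: the toy corpus, embedding size, LSTM width, and number of epochs are illustrative assumptions. The model is trained to predict the next word from a prefix, and new text is generated by repeatedly feeding the model its own predictions.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Toy corpus (placeholder); in practice this would be a much larger text.
corpus = ["the cat sat on the mat", "the dog sat on the log"]

tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
vocab_size = len(tokenizer.word_index) + 1

# Build (prefix, next word) training pairs from each sentence.
sequences = []
for line in corpus:
    encoded = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(encoded)):
        sequences.append(encoded[: i + 1])

max_len = max(len(s) for s in sequences)
padded = pad_sequences(sequences, maxlen=max_len, padding="pre")
X, y = padded[:, :-1], padded[:, -1]

# A small word-level language model: embedding -> LSTM -> softmax over vocabulary.
model = Sequential([
    Embedding(vocab_size, 16),
    LSTM(64),
    Dense(vocab_size, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=200, verbose=0)

# Generate new text by repeatedly predicting the most likely next word.
seed = "the cat"
for _ in range(4):
    enc = tokenizer.texts_to_sequences([seed])[0]
    enc = pad_sequences([enc], maxlen=max_len - 1, padding="pre")
    next_id = int(np.argmax(model.predict(enc, verbose=0), axis=-1)[0])
    seed += " " + tokenizer.index_word.get(next_id, "")
print(seed)
```

A character-based model works the same way, except the vocabulary consists of individual characters rather than words, so it can invent unseen words at the cost of needing longer sequences to capture the same context.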
