Chapter 11

Text Autocompletion with LSTM and Beam Search

In Chapter 9, “Predicting Time Sequences with Recurrent Neural Networks,” we explored how to use recurrent neural networks (RNNs) to predict numerical values. In this chapter, instead of working with a time sequence of numerical values, we apply our RNN to natural language text (English). There are two straightforward ways of doing this: we can view text as a sequence of characters or as a sequence of words. In this chapter, we treat it as a sequence of characters because that is the simplest way to get started. In many cases, it is more powerful to work with words than with characters, and that approach is explored in the next couple of chapters.
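To make the character-level view concrete, here is a minimal sketch (an illustration, not code from the book) of the typical first step: building a character vocabulary and encoding a string as a sequence of integer indices, which can then be one-hot encoded and fed to an RNN. The variable names are assumptions for illustration.

```python
# Treat text as a sequence of characters: map each unique character
# to an integer index. This is a common preprocessing step before
# one-hot encoding character sequences for an RNN.
text = "hello world"

# Build a sorted character vocabulary and an index for each character.
vocab = sorted(set(text))
char_to_index = {ch: i for i, ch in enumerate(vocab)}

# Encode the text as a sequence of integer indices.
encoded = [char_to_index[ch] for ch in text]

# The inverse mapping lets us decode model output back into text.
index_to_char = {i: ch for ch, i in char_to_index.items()}
decoded = "".join(index_to_char[i] for i in encoded)
```

By contrast, a word-level view would start from `text.split()`, trading a much larger vocabulary for shorter sequences.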

In addition to working with text instead ...
