Chapter 12
Neural Language Models and Word Embeddings
In Chapter 11, “Text Autocompletion with LSTM and Beam Search,” we built a network that predicts the continuation of a sentence. One remarkable property of that model is that it learns both words and sentence structure. We did nothing to prevent the model from producing random, nonexistent words or grammatically nonsensical sentences, yet it did neither. Still, it seems that we made the task unnecessarily hard for the model by giving it individual characters instead of words as the smallest building blocks. After all, humans do not actually communicate with individual characters; they use characters primarily as a written tool for representing the words they are communicating.
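To make the contrast concrete, the following minimal sketch (plain Python, with an invented example sentence; no framework assumed) shows how the same sentence decomposes into character tokens versus word tokens:

```python
# Contrast the two choices of smallest building block:
# character-level input (as in Chapter 11) versus word-level input.
sentence = "the boy rode the bike"

# Character-level: every character, including spaces, is a token.
char_tokens = list(sentence)
char_vocab = sorted(set(char_tokens))

# Word-level: whole words are the tokens.
word_tokens = sentence.split()
word_vocab = sorted(set(word_tokens))

print(f"char tokens: {len(char_tokens)}, char vocabulary size: {len(char_vocab)}")
print(f"word tokens: {len(word_tokens)}, word vocabulary size: {len(word_vocab)}")
```

With characters as input, the model must first learn to assemble valid words before it can learn anything about sentence structure; with words as input, that first step is given for free, at the cost of a much larger vocabulary.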
In ...