January 2019
In this section, we'll discuss how to generate new text using a character-based language model implemented with TensorFlow (TF). This is an example of a "many-to-many" task, like the one we defined in the Recurrent neural networks section. We'll only discuss the most interesting code sections, but the full example lives at https://github.com/ivan-vasilev/Python-Deep-Learning-SE/tree/master/ch07/language%20model.
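Before looking at the model itself, it helps to see what "character-based" and "many-to-many" mean in terms of data. The following is a minimal sketch (plain Python, independent of the full TensorFlow example) of the preparation step such a model relies on: each distinct character is mapped to an integer id, and each training pair consists of an input sequence and a target sequence that is the input shifted one character ahead, so the model predicts the next character at every time step. The `make_pairs` helper and the toy corpus are illustrative, not taken from the repository.

```python
text = "hello world"  # toy corpus; the real example trains on a large text file

# character-level vocabulary: every distinct character gets an integer id
vocab = sorted(set(text))
char_to_idx = {ch: i for i, ch in enumerate(vocab)}
idx_to_char = {i: ch for ch, i in char_to_idx.items()}

encoded = [char_to_idx[ch] for ch in text]

def make_pairs(ids, seq_len):
    """Slide a window over the id sequence; the target is the input
    shifted by one step (many-to-many: one prediction per time step)."""
    pairs = []
    for start in range(len(ids) - seq_len):
        x = ids[start:start + seq_len]
        y = ids[start + 1:start + seq_len + 1]
        pairs.append((x, y))
    return pairs

pairs = make_pairs(encoded, seq_len=5)
x0, y0 = pairs[0]
# decode back to characters to see the one-step shift
print("".join(idx_to_char[i] for i in x0))  # "hello"
print("".join(idx_to_char[i] for i in y0))  # "ello "
```

In the full example, each integer id is then one-hot encoded (or embedded) before being fed to the recurrent network, and the same `idx_to_char` mapping is used at sampling time to turn the predicted ids back into text.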
In most cases, language modeling is performed at the word level, where the distribution is over a fixed vocabulary of |V| words. Vocabularies in realistic tasks, such as the language models used in speech recognition, often exceed 100,000 words. This large dimensionality makes modeling the output ...