
Python Deep Learning - Second Edition

by Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
January 2019
Intermediate to advanced
386 pages
11h 13m
English
Packt Publishing
Content preview from Python Deep Learning - Second Edition

Character-based models for generating new text

In this section, we'll discuss how to generate new text using character-based models via TensorFlow (TF). This is an example of a "many-to-many" relationship, such as the one we defined in the Recurrent neural networks section. We'll discuss only the most interesting code sections; the full example is available at https://github.com/ivan-vasilev/Python-Deep-Learning-SE/tree/master/ch07/language%20model.
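Since the preview doesn't reproduce the repository code, here is a minimal sketch of the idea written against tf.keras; the toy corpus, layer sizes, sequence length, and variable names are illustrative assumptions, not the book's actual example:

import numpy as np
import tensorflow as tf

# Toy corpus and character vocabulary (illustrative assumption)
text = "hello world, hello deep learning"
chars = sorted(set(text))
char2idx = {c: i for i, c in enumerate(chars)}
idx2char = np.array(chars)

# Build (input, target) pairs: the target is the input shifted by one
# character, so the network predicts the next character at every step
# ("many-to-many", as in the Recurrent neural networks section)
seq_len = 10
encoded = np.array([char2idx[c] for c in text])
inputs, targets = [], []
for i in range(len(encoded) - seq_len):
    inputs.append(encoded[i:i + seq_len])
    targets.append(encoded[i + 1:i + seq_len + 1])
inputs, targets = np.array(inputs), np.array(targets)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 16),
    tf.keras.layers.LSTM(64, return_sequences=True),  # one output per time step
    tf.keras.layers.Dense(len(chars)),                 # logits over the characters
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.fit(inputs, targets, epochs=5, verbose=0)

# Generate new text by repeatedly sampling the next character
seed = [char2idx[c] for c in "hello "]
for _ in range(50):
    logits = model.predict(np.array([seed]))[0, -1].astype(np.float64)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    seed.append(np.random.choice(len(chars), p=probs))
print("".join(idx2char[i] for i in seed))

Sampling from the softmax distribution, rather than always taking the most probable character, keeps the generated text from collapsing into a short repeating loop.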

In most cases, language modeling is performed at the word level, where the distribution is over a fixed vocabulary of |V| words. Vocabularies in realistic tasks, such as the language models used in speech recognition, often exceed 100,000 words. This large dimensionality makes modeling the output ...
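As a quick illustration of the dimensionality argument (using an assumed toy sentence, not data from the book), the word-level vocabulary grows with the corpus, while the character-level vocabulary stays at a few dozen symbols, which keeps the output softmax small:

corpus = "the quick brown fox jumps over the lazy dog"
word_vocab = sorted(set(corpus.split()))  # |V| at the word level (100,000+ in realistic tasks)
char_vocab = sorted(set(corpus))          # only a few dozen distinct characters
print(len(word_vocab), len(char_vocab))   # 8 words vs. 27 characters here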


ISBN: 9781789348460