Grokking Deep Learning
by Andrew W. Trask
Manning Publications, February 2019
336 pages
Intermediate to advanced

Chapter 14. Learning to write like Shakespeare: long short-term memory

In this chapter

  • Character language modeling
  • Truncated backpropagation
  • Vanishing and exploding gradients
  • A toy example of RNN backpropagation
  • Long short-term memory (LSTM) cells

“Lord, what fools these mortals be!”

William Shakespeare, A Midsummer Night’s Dream

Character language modeling

Let’s tackle a more challenging task with the RNN

At the end of chapters 12 and 13, you trained vanilla recurrent neural networks (RNNs) that learned a simple series prediction problem. But you were training over a toy dataset of phrases that were synthetically generated using rules.

In this chapter, you’ll attempt language modeling over a much more challenging dataset: the works of ...
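The task itself is simple to set up: the corpus becomes one long sequence of characters, and the network learns to predict each character from the ones that came before it. Below is a minimal sketch of that preparation in NumPy; the file name shakespear.txt and the variable names are illustrative assumptions, not the book's own listing.

import numpy as np

# A minimal sketch (not the book's listing): prepare a text corpus for
# character-level language modeling. The file name 'shakespear.txt' is
# an assumption; any plain-text file works.
raw = open('shakespear.txt').read()

# Build the character vocabulary and an index for each character.
vocab = sorted(set(raw))
char2index = {c: i for i, c in enumerate(vocab)}

# Encode the whole corpus as integer indices.
indices = np.array([char2index[c] for c in raw])

# Training pairs: every character is used to predict the character after it.
inputs, targets = indices[:-1], indices[1:]

# One-hot encode a small batch on the fly (one-hotting the full corpus
# at once would waste memory).
batch = np.eye(len(vocab))[inputs[:32]]        # shape: (32, vocab_size)
print(len(vocab), batch.shape, targets[:10])

From here, each input character (as an index or a one-hot row) is fed to the RNN one timestep at a time, and the prediction error against the next character drives training.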



Publisher Resources

ISBN: 9781617293702