Chapter 14. Learning to write like Shakespeare: long short-term memory
In this chapter
- Character language modeling
- Truncated backpropagation
- Vanishing and exploding gradients
- A toy example of RNN backpropagation
- Long short-term memory (LSTM) cells
“Lord, what fools these mortals be!”
William Shakespeare, A Midsummer Night’s Dream
Character language modeling
Let’s tackle a more challenging task with the RNN
At the end of chapters 12 and 13, you trained vanilla recurrent neural networks (RNNs) on a simple series prediction problem. But that was a toy dataset of phrases synthetically generated from rules.
In this chapter, you’ll attempt language modeling over a much more challenging dataset: the works of Shakespeare.
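Before training anything, a character language model needs its text encoded as a sequence of integer indices, one per character. Here is a minimal sketch of that preparation step, using a short stand-in string rather than the full Shakespeare corpus (which the chapter would load from a text file):

```python
import numpy as np

# Stand-in text; the actual dataset would be the works of Shakespeare.
text = "Lord, what fools these mortals be!"

# Build a character vocabulary: one integer index per unique character.
vocab = sorted(set(text))
char2idx = {c: i for i, c in enumerate(vocab)}
idx2char = {i: c for i, c in enumerate(vocab)}

# Encode the text as an index array. A character language model is
# trained to predict the next index from the ones seen so far, so the
# inputs are the sequence shifted left by one relative to the targets.
indices = np.array([char2idx[c] for c in text])
inputs, targets = indices[:-1], indices[1:]

print("vocabulary size:", len(vocab))
```

The same vocabulary maps predictions back to characters, which is how generated text is read out after training.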