8

Regularization with Recurrent Neural Networks

In this chapter, we will work with Recurrent Neural Networks (RNNs). As we will see, they are well suited for Natural Language Processing (NLP) tasks, though they also apply well to time series. After learning how to train RNNs, we will apply several regularization methods, such as dropout and limiting the maximum sequence length. This will give you foundational knowledge that applies to NLP and time series tasks, as well as the background needed to understand the more advanced techniques covered in the next chapter.
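As a rough preview of what training an RNN involves, the following is a minimal sketch using PyTorch (an assumption on my part; the class names, dimensions, and data here are illustrative, not taken from the book's recipes):

```python
import torch
import torch.nn as nn

# Illustrative token-level RNN classifier (hypothetical example, not the book's code).
class SimpleRNN(nn.Module):
    def __init__(self, vocab_size=100, embed_dim=16, hidden_dim=32, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, seq_len) of token indices
        emb = self.embed(x)
        _, h_n = self.rnn(emb)          # h_n: (1, batch, hidden_dim)
        return self.fc(h_n.squeeze(0))  # logits: (batch, num_classes)

model = SimpleRNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on random stand-in data
x = torch.randint(0, 100, (8, 20))  # batch of 8 sequences, length 20
y = torch.randint(0, 2, (8,))       # binary labels
logits = model(x)
loss = criterion(logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The recipes in this chapter cover the full training loop and data preparation in detail; this sketch only shows the overall shape of a forward pass and a single optimization step.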

In this chapter, we’ll cover the following recipes:

  • Training an RNN
  • Training a Gated Recurrent Unit (GRU)
  • Regularizing with dropout
  • Regularizing ...
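To give a sense of how the regularization recipes above fit together, here is a hedged sketch (again assuming PyTorch; `MAX_LEN`, the dropout rate, and all dimensions are illustrative choices, not values from the book) combining dropout with a maximum sequence length:

```python
import torch
import torch.nn as nn

MAX_LEN = 50  # illustrative maximum sequence length used for truncation

# Hypothetical GRU classifier combining two regularizers: dropout and input truncation.
class GRUClassifier(nn.Module):
    def __init__(self, vocab_size=100, embed_dim=16, hidden_dim=32, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # dropout is applied between stacked GRU layers (requires num_layers > 1)
        self.gru = nn.GRU(embed_dim, hidden_dim, num_layers=2,
                          dropout=0.3, batch_first=True)
        self.drop = nn.Dropout(0.3)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        x = x[:, :MAX_LEN]                # enforce the maximum sequence length
        _, h_n = self.gru(self.embed(x))  # h_n: (num_layers, batch, hidden_dim)
        return self.fc(self.drop(h_n[-1]))  # classify from the last layer's state

model = GRUClassifier()
x = torch.randint(0, 100, (4, 80))  # sequences longer than MAX_LEN are truncated
logits = model(x)                   # shape: (4, num_classes)
```

Truncating long sequences both bounds the computation per example and acts as a simple regularizer; the dropout recipe discusses where in the network dropout is most effective.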
