Chapter 6

Gated Recurrent Units (GRUs)

Learning Objectives

By the end of this chapter, you will be able to:

  • Assess the drawbacks of simple Recurrent Neural Networks (RNNs)
  • Describe the architecture of Gated Recurrent Units (GRUs)
  • Perform sentiment analysis using GRUs
  • Apply GRUs for text generation

This chapter presents a solution to the drawbacks of the simple RNN architecture.
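To preview the architecture covered in this chapter, the sketch below implements a single GRU step in NumPy, following the formulation of Cho et al. (2014): an update gate z, a reset gate r, and a candidate state that are combined into the new hidden state. The function name `gru_step` and the parameter layout are illustrative choices, not from this book's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step (Cho et al., 2014 formulation)."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)              # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)  # candidate state
    # New state interpolates between the old state and the candidate.
    return (1.0 - z) * h_prev + z * h_tilde

# Tiny illustrative example: 4 input features, 3 hidden units.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
params = [rng.standard_normal(shape) for shape in
          [(n_hid, n_in), (n_hid, n_hid), (n_hid,)] * 3]
h = np.zeros(n_hid)
x = rng.standard_normal(n_in)
h = gru_step(x, h, params)
```

Because the update gate can learn to keep z near zero, the old state passes through almost unchanged, which is what lets gradients survive over long sequences.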


In previous chapters, we studied text processing techniques such as word embedding, tokenization, and Term Frequency-Inverse Document Frequency (TF-IDF). We also learned about a specific network architecture called a Recurrent Neural Network (RNN), which suffers from the vanishing gradient problem.
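The vanishing gradient problem mentioned above can be seen numerically: in a simple RNN, the gradient with respect to an early hidden state is a product of per-step factors, each roughly the recurrent weight times the tanh derivative. The sketch below (illustrative weight value, not from the book) multiplies these chain-rule factors over 50 time steps.

```python
import numpy as np

w = 0.5   # illustrative recurrent weight with |w| < 1
h = 0.0   # hidden state
grad = 1.0
grads = []
for t in range(50):
    h = np.tanh(w * h + 1.0)      # simple RNN update with constant input
    grad *= w * (1.0 - h ** 2)    # chain-rule factor dh_t / dh_{t-1}
    grads.append(abs(grad))
```

Each factor here is well below 1, so the accumulated gradient shrinks geometrically; after 50 steps it is effectively zero, and the network cannot learn dependencies that span that many steps. Gating mechanisms such as the GRU's update gate address exactly this.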

In this chapter, we are going to study ...