Chapter 6

Gated Recurrent Units (GRUs)

Learning Objectives

By the end of this chapter, you will be able to:

  • Assess the drawbacks of simple Recurrent Neural Networks (RNNs)
  • Describe the architecture of Gated Recurrent Units (GRUs)
  • Perform sentiment analysis using GRUs
  • Apply GRUs for text generation

This chapter presents a solution to these drawbacks of the simple RNN architecture.

Introduction

In previous chapters, we studied text processing techniques such as word embedding, tokenization, and Term Frequency-Inverse Document Frequency (TF-IDF). We also learned about a specific network architecture called a Recurrent Neural Network (RNN), which suffers from the vanishing gradient problem when learning long-range dependencies.
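To preview the idea this chapter develops: a GRU replaces the simple RNN's single recurrence with gated updates that help gradients flow over long sequences. Below is a minimal sketch of one GRU time step in plain NumPy. The weight names (`Wz`, `Uz`, and so on) follow the standard GRU equations but are illustrative; exact conventions (e.g., which gate multiplies the old state) vary slightly between sources and libraries.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU time step: gated update of hidden state h given input x."""
    Wz, Uz, bz = params["z"]  # update gate parameters
    Wr, Ur, br = params["r"]  # reset gate parameters
    Wh, Uh, bh = params["h"]  # candidate state parameters
    z = sigmoid(Wz @ x + Uz @ h + bz)              # update gate: how much to refresh
    r = sigmoid(Wr @ x + Ur @ h + br)              # reset gate: how much past to use
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate hidden state
    # Interpolate between the old state and the candidate (one common convention)
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
params = {g: (rng.normal(size=(n_hid, n_in)),
              rng.normal(size=(n_hid, n_hid)),
              np.zeros(n_hid)) for g in ("z", "r", "h")}

h = np.zeros(n_hid)
for t in range(5):  # run a short toy input sequence
    h = gru_step(rng.normal(size=n_in), h, params)
print(h.shape)  # (3,)
```

Because the new state is a convex combination of the old state and a bounded candidate, the hidden values stay in (-1, 1), and the gates let the network decide per dimension how much history to carry forward.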

In this chapter, we are going to study ...
