Chapter 8

State-of-the-Art Natural Language Processing

Learning Objectives

By the end of this chapter, you will be able to:

  • Evaluate the vanishing gradient problem in long sentences
  • Describe attention mechanisms as a state-of-the-art approach in NLP
  • Assess one specific attention mechanism architecture
  • Develop a neural machine translation model using an attention mechanism
  • Develop a text summarization model using an attention mechanism

This chapter aims to acquaint you with the current practices and technologies in the NLP domain.
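As a preview of the attention mechanisms this chapter covers, the core idea can be sketched in a few lines of NumPy. The scaled dot-product form shown here is one common formulation, chosen for illustration; the chapter's specific architecture is not named in this excerpt, so treat this as an assumption:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """One common attention variant (illustrative, not the book's exact model).

    Q: (n_queries, d_k) queries; K: (n_keys, d_k) keys; V: (n_keys, d_v) values.
    Returns the attended output and the attention weights.
    """
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep softmax well-behaved
    scores = Q @ K.T / np.sqrt(d_k)
    # Each row of weights sums to 1: a soft selection over the keys
    weights = softmax(scores, axis=-1)
    return weights @ V, weights
```

Intuitively, each query produces a weighted average of the values, with weights determined by query-key similarity; this lets a model focus on the most relevant positions in a long sequence instead of compressing everything into one fixed vector.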

Introduction

In the previous chapter, we studied Long Short-Term Memory (LSTM) units, which help combat the vanishing gradient problem. We also studied gated recurrent units (GRUs) in detail, which handle vanishing gradients in their own way. ...
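The vanishing gradient problem mentioned above can be illustrated numerically. In a vanilla RNN, the gradient reaching an early timestep contains a product of per-step Jacobians; when their norms stay below 1, that product shrinks exponentially with sequence length. A minimal sketch with an illustrative norm of 0.9 (the value is an assumption, not taken from the book):

```python
import numpy as np

def gradient_norm_after(steps, jacobian_norm=0.9):
    """Upper bound on the gradient factor after backpropagating `steps`
    timesteps, when each per-step Jacobian has norm `jacobian_norm`."""
    # Repeated multiplication by a factor below 1 decays exponentially
    return jacobian_norm ** steps

for t in (1, 10, 50, 100):
    print(f"t={t:3d}  gradient factor ~ {gradient_norm_after(t):.2e}")
```

Running this shows the gradient factor collapsing toward zero as the sequence grows, which is why long sentences starve early timesteps of learning signal in plain RNNs, and why LSTMs and GRUs add gating to preserve it.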
