Chapter 8

State-of-the-Art Natural Language Processing

Learning Objectives

By the end of this chapter, you will be able to:

  • Explain how vanishing gradients affect the processing of long sentences
  • Describe the attention mechanism as a state-of-the-art approach in NLP
  • Assess a specific attention mechanism architecture
  • Develop a neural machine translation model using an attention mechanism
  • Develop a text summarization model using an attention mechanism

This chapter aims to acquaint you with the current practices and technologies in the NLP domain.


In the previous chapter, we studied Long Short-Term Memory (LSTM) units, which help combat the vanishing gradient problem. We also studied the gated recurrent unit (GRU) in detail, which has its own way of handling vanishing gradients. ...
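As a preview of the attention mechanism this chapter covers, the following is a minimal NumPy sketch of scaled dot-product attention. The function names, shapes, and random inputs are illustrative assumptions, not code from this book: each query is compared against every key, the similarities are normalized with a softmax, and the values are combined using those weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row of weights sums to 1
    return weights @ V, weights

# Illustrative sizes: 2 query positions, 5 key/value positions, dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 4))

out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)  # (2, 4) (2, 5)
```

Because every query attends directly to every position, the path between distant tokens is a single weighted sum rather than a long recurrent chain, which is why attention sidesteps the vanishing gradient issue that LSTMs and GRUs mitigate.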
