5 Sequential NLP

This chapter covers

  • Using memory to analyze sequential NLP tasks
  • Understanding how RNNs, LSTM networks, and end-to-end memory networks handle memory
  • Applying these techniques to a shared task: Question Answering

The central task in this chapter is Question Answering: answering a question based on a number of facts. This task involves memory: facts are stored in memory, and the question refers back to past information. How do the various models for sequential processing stack up against this task?
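To make the task concrete, the following snippet shows what a fact-based question-answering instance looks like. The story, question, and answer are illustrative examples in the style of the bAbI tasks, not data taken from this chapter:

```python
# A fact-based question-answering instance: the model must store the
# facts (the story) in memory and consult them to answer the question.
# Illustrative example in the style of the bAbI tasks.
story = [
    "Mary moved to the bathroom.",
    "John went to the hallway.",
    "Mary travelled to the kitchen.",
]
question = "Where is Mary?"
answer = "kitchen"  # requires recalling the most recent fact about Mary
```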

We will demonstrate the difference between flat memory approaches, like recurrent neural networks (RNNs) and long short-term memory (LSTM) networks, and responsive memory approaches, like end-to-end memory networks, in the context ...
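As a point of reference for the flat memory approaches, here is a minimal sketch, assuming TensorFlow/Keras, of an LSTM baseline: the story and question are concatenated into one token sequence, so every fact must survive inside a single hidden state. All sizes (`vocab_size`, `max_len`, layer widths) are illustrative assumptions, and the random data merely demonstrates the input and output shapes:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 50  # assumed toy vocabulary size
max_len = 30     # assumed length of padded story-plus-question sequences

# Flat memory: one LSTM hidden state has to carry all of the facts.
model = Sequential([
    Embedding(vocab_size, 32),                # token ids -> dense vectors
    LSTM(64),                                 # final state summarizes the sequence
    Dense(vocab_size, activation="softmax"),  # predict the answer word
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy batch: 8 padded token-id sequences and their answer-word ids.
x = np.random.randint(0, vocab_size, size=(8, max_len))
y = np.random.randint(0, vocab_size, size=(8,))
model.fit(x, y, epochs=1, verbose=0)
```

An end-to-end memory network, by contrast, keeps each fact as a separately addressable memory slot and lets the question attend over those slots, rather than compressing everything into a single hidden state.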
