
Deep Learning for Natural Language Processing

by Stephan Raaijmakers
November 2022
Content level: Beginner to intermediate
296 pages
8h 27m
English
Manning Publications
Content preview from Deep Learning for Natural Language Processing

5 Sequential NLP

This chapter covers

  • Using memory to analyze sequential NLP tasks
  • Understanding how RNNs, LSTM networks, and end-to-end memory networks handle memory
  • Applying these techniques to a shared task: Question Answering

The central task in this chapter is Question Answering: answering a question based on a number of facts. This task involves memory: the facts are stored, and the question refers back to that earlier information. How do the various models for sequential processing stack up against this task?
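To make the memory requirement concrete, here is a toy facts-plus-question instance (illustrative data only, not taken from the book). The answer depends on combining a fact stated at the start with one stated later, so a model must retain the earlier fact while it reads on:

facts = [
    "Mary went to the kitchen.",   # fact 1: Mary's location
    "John moved to the garden.",   # distractor fact
    "Mary picked up the apple.",   # fact 3: Mary takes the apple
]
question = "Where is the apple?"
answer = "kitchen"   # follows only from facts 1 and 3 together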

We will demonstrate the difference between flat memory approaches, such as recurrent neural networks (RNNs) and long short-term memory (LSTM) networks, and responsive memory approaches, such as end-to-end memory networks, in the context ...
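As a rough sketch of the "flat memory" end of this spectrum (a minimal Keras example, not the book's code; the vocabulary size, sequence lengths, and layer sizes are illustrative assumptions), the facts and the question are embedded, concatenated into one long sequence, and read by a single LSTM whose final hidden state must carry everything needed to predict the answer word. An end-to-end memory network would instead keep each fact as a separately addressable memory that the question attends to.

from tensorflow.keras import layers, Model

# Illustrative sizes (assumptions, not taken from the book)
vocab_size, story_len, question_len = 50, 60, 6

story_in    = layers.Input(shape=(story_len,),    dtype="int32")
question_in = layers.Input(shape=(question_len,), dtype="int32")

embed  = layers.Embedding(vocab_size, 64)                       # shared word embeddings
merged = layers.Concatenate(axis=1)([embed(story_in), embed(question_in)])

state  = layers.LSTM(64)(merged)                                # one flat pass over facts + question
answer = layers.Dense(vocab_size, activation="softmax")(state)  # predict the answer word

model = Model([story_in, question_in], answer)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")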



Publisher Resources

ISBN: 9781617295447