
Deep Learning for Natural Language Processing

by Stephan Raaijmakers
November 2022
Content level: Beginner to intermediate
296 pages
8h 27m
English
Manning Publications
Content preview from Deep Learning for Natural Language Processing

7 Attention

This chapter covers

  • Implementing attention in MLPs and LSTMs
  • Using attention to improve the performance of a deep learning model
  • Explaining model outcomes by highlighting attention patterns connected to input data
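Before the chapter builds attention into full MLP and LSTM models, the core mechanism can be sketched in a few lines. The following is a minimal NumPy illustration of (scaled) dot-product attention — a query vector scores each input position, the scores are normalized with a softmax, and the result is a weighted sum of the values. The function and variable names here are illustrative, not the book's own code, and the chapter's Keras implementations may use a different attention formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(query, keys, values):
    """Scaled dot-product attention over one sequence.

    query:  (d,)     the vector doing the attending
    keys:   (T, d)   one key per input position
    values: (T, d_v) one value per input position
    Returns the attention-weighted sum of values plus the weights,
    which sum to 1 and can be inspected to explain the model's focus.
    """
    scores = keys @ query / np.sqrt(query.shape[-1])  # (T,) similarity scores
    weights = softmax(scores)                         # (T,) normalized to sum to 1
    context = weights @ values                        # (d_v,) weighted sum of values
    return context, weights

# Toy usage: 5 input positions, key dimension 4, value dimension 3.
rng = np.random.default_rng(0)
q = rng.normal(size=4)
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 3))
ctx, w = dot_product_attention(q, K, V)
print(ctx.shape, round(w.sum(), 6))
```

The attention weights `w` are exactly the "attention patterns" the chapter later visualizes: each weight says how much one input position contributed to the output.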

7.1 Neural attention

In the field of neurocognition, attention is defined as a form of cognitive focus arising from the limited availability of computational resources. The brain, while powerful, is vulnerable to distraction during cognitive processing and tends to block out irrelevant information. For example, if you’re engaged in an intense phone call at work, you block out stimuli from the coworkers around you. On the other hand, if you are focusing on a hard cognitive task and someone starts ...



ISBN: 9781617295447