7 Attention
This chapter covers
- Implementing attention in MLPs and LSTMs
- Using attention to improve the performance of a deep learning model
- Explaining model outcomes by highlighting attention patterns connected to input data
7.1 Neural attention
In the field of neurocognition, attention is defined as a form of cognitive focus arising from the limited availability of computational resources. The brain, while powerful, is vulnerable to distraction during cognitive processing and tends to block out irrelevant information. For example, if you’re engaged in an intense phone call at work, you block out irrelevant stimuli from the coworkers around you. On the other hand, if you are focusing on a hard cognitive task and someone starts ...
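The neural analogue of this selective focus can be sketched with a few lines of NumPy: a query vector is scored against a set of hidden states, the scores are normalized with a softmax into attention weights, and the states are summed under those weights. This is only an illustrative sketch, not the chapter's implementation; the `states` and `query` values are made-up placeholders.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical hidden states for four input tokens (3 dimensions each).
states = np.array([
    [0.1, 0.4, 0.2],
    [0.9, 0.1, 0.5],
    [0.3, 0.8, 0.7],
    [0.2, 0.2, 0.1],
])

# A made-up query vector representing the model's current focus.
query = np.array([1.0, 0.0, 0.5])

# Score each state against the query (dot-product similarity),
# then normalize the scores into attention weights that sum to 1.
scores = states @ query
weights = softmax(scores)

# The context vector is the attention-weighted sum of the states:
# states that match the query contribute more, others are "blocked out".
context = weights @ states

print("weights:", weights)
print("context:", context)
```

Just as the phone call drowns out the office chatter, the softmax concentrates weight on the states most similar to the query, so irrelevant inputs contribute little to the context vector.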