Hands-On Natural Language Processing with Python by Rajalingappaa Shanmugamani, Rajesh Arumugam

Memory module  

The magic of memory network models lies in the formulation of the memory module, which performs a soft attention mechanism over the fact embeddings. The literature on memory networks and other attention-based models introduces many different types of attention mechanisms, but all of them hinge on the same core operation for measuring semantic or syntactic similarity: an element-wise multiplication of two vectors followed by a summation, that is, a dot product. We will call this the reduce-dot operation; it takes two vectors and produces a single number denoting their similarity score.
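
To make the reduce-dot operation concrete, here is a minimal NumPy sketch; the vectors and their dimensionality are made-up values for illustration, and the function name reduce_dot is our own:

    import numpy as np

    def reduce_dot(u, v):
        # Element-wise multiplication followed by a summation:
        # this is the dot product, yielding one similarity score.
        return np.sum(u * v)

    # Hypothetical embeddings, for illustration only.
    question = np.array([0.2, -0.1, 0.7, 0.4])
    fact = np.array([0.3, 0.0, 0.5, 0.1])

    score = reduce_dot(question, fact)  # equivalent to question @ fact
    print(score)  # 0.45

In TensorFlow, the same operation is typically written as tf.reduce_sum(u * v, axis=-1), which is presumably where the reduce in reduce-dot comes from.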

We have formulated our attention mechanism as follows (a generic sketch of one such attention read appears after the list):

  1. The context vector is used to encode all the information required to produce an output and is initialized ...
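
The following sketch shows one common way a soft-attention read over the fact embeddings can be implemented; it is a generic illustration under our own assumptions (the names, shapes, and values are hypothetical), not necessarily the exact formulation used in this chapter:

    import numpy as np

    def softmax(x):
        # Numerically stable softmax over a vector of scores.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    def soft_attention(context, fact_embeddings):
        # Score every fact against the context with reduce-dot,
        # normalize the scores into attention weights, and return
        # the attention-weighted sum of the fact embeddings.
        scores = fact_embeddings @ context   # one reduce-dot per fact
        weights = softmax(scores)            # soft attention weights
        return weights @ fact_embeddings     # weighted memory read

    # Hypothetical shapes and values, for illustration only.
    rng = np.random.default_rng(0)
    facts = rng.normal(size=(5, 4))    # five fact embeddings
    context = rng.normal(size=4)       # e.g., derived from the question
    memory_read = soft_attention(context, facts)

Because the softmax assigns every fact a nonzero weight, the read is differentiable end to end, which is what makes the attention "soft".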
