Bahdanau attention

We can solve this problem with the help of the attention mechanism (see Neural Machine Translation by Jointly Learning to Align and Translate at https://arxiv.org/abs/1409.0473), an extension of the seq2seq model that provides a way for the decoder to work with all of the encoder's hidden states, not just the last one.
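To make this concrete, here is a minimal sketch of additive (Bahdanau) attention in PyTorch. The layer names (W_query, W_keys, v), the sizes, and the tensor shapes are illustrative assumptions rather than the book's exact implementation: for the current decoder state, the module scores every encoder hidden state, normalizes the scores with a softmax, and returns the weighted sum (the context vector) together with the attention weights.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    def __init__(self, dec_hidden_size, enc_hidden_size, attn_size):
        super().__init__()
        self.W_query = nn.Linear(dec_hidden_size, attn_size)  # transforms the previous decoder state
        self.W_keys = nn.Linear(enc_hidden_size, attn_size)   # transforms each encoder hidden state
        self.v = nn.Linear(attn_size, 1, bias=False)          # reduces each combined vector to a scalar score

    def forward(self, dec_state, enc_states):
        # dec_state: [batch, dec_hidden_size] -- previous decoder hidden state
        # enc_states: [batch, src_len, enc_hidden_size] -- all encoder hidden states
        query = self.W_query(dec_state).unsqueeze(1)            # [batch, 1, attn_size]
        keys = self.W_keys(enc_states)                          # [batch, src_len, attn_size]
        scores = self.v(torch.tanh(query + keys)).squeeze(-1)   # alignment scores: [batch, src_len]
        weights = F.softmax(scores, dim=-1)                     # attention weights sum to 1 over the source
        context = torch.bmm(weights.unsqueeze(1), enc_states)   # weighted sum of encoder states
        return context.squeeze(1), weights                      # context: [batch, enc_hidden_size]

The returned weights are also what make the interpretability mentioned later in this section possible: plotting them for each decoded token shows which source positions the decoder attended to.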

The type of attention mechanism in this section is called Bahdanau attention, after the author of the original paper.

Besides solving the bottleneck problem, the attention mechanism has some other advantages. For one, the immediate access to all previous states helps to prevent the vanishing gradients problem. It also allows for some interpretability of the results, because we can see what parts of the input the decoder attended to when it generated each element of the output.
