Python Deep Learning - Second Edition

by Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
January 2019
Intermediate to advanced
386 pages
11h 13m
English
Packt Publishing

Sequence to sequence with attention

The decoder has to generate the entire output sequence based solely on the thought vector. For this to work, the thought vector has to encode all the information of the input sequence. However, the encoder is an RNN, so we can expect its hidden state to carry more information about the most recent sequence elements than about the earliest ones.
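To make this concrete, here is a minimal NumPy sketch (with hypothetical weight matrices and sizes, not the book's code) of how a plain RNN encoder compresses an input sequence of any length into a single fixed-size thought vector, its final hidden state:

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_size = 4
seq_len = 6  # input sequence length

# Simple RNN step: h_t = tanh(W_x x_t + W_h h_{t-1}); weights are illustrative
W_x = rng.normal(size=(hidden_size, hidden_size)) * 0.1
W_h = rng.normal(size=(hidden_size, hidden_size)) * 0.1

h = np.zeros(hidden_size)
for t in range(seq_len):
    x_t = rng.normal(size=hidden_size)  # stand-in for an input embedding
    h = np.tanh(W_x @ x_t + W_h @ h)

# The decoder sees only this fixed-size vector, regardless of seq_len
thought_vector = h
```

Because `thought_vector` has a fixed size while `seq_len` can grow arbitrarily, the earliest inputs are repeatedly overwritten by later RNN steps, which is exactly the bottleneck described above.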

Using LSTM cells and reversing the input sequence helps, but cannot eliminate this information loss entirely. As a result, the thought vector becomes something of a bottleneck: the seq2seq model works well for short sentences, but its performance deteriorates on longer ones. To solve this problem, Bahdanau et al. (https://arxiv.org/abs/1409.0473) proposed a seq2seq extension called ...
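The core idea of the cited Bahdanau et al. paper is additive attention: instead of relying on the final hidden state alone, the decoder computes a weighted sum over all encoder hidden states at every output step. Below is a minimal NumPy sketch of one such step; the matrices `W_s`, `W_h`, and vector `v` are learned parameters in practice, and all sizes here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
hidden = 4
seq_len = 6

# Encoder hidden states h_1..h_T (one row per time step) and the
# decoder's previous state s; random stand-ins for illustration
H = rng.normal(size=(seq_len, hidden))
s = rng.normal(size=hidden)

# Additive (Bahdanau) attention score: e_t = v^T tanh(W_s s + W_h h_t)
W_s = rng.normal(size=(hidden, hidden))
W_h = rng.normal(size=(hidden, hidden))
v = rng.normal(size=hidden)

scores = np.tanh(H @ W_h.T + s @ W_s.T) @ v   # one score per time step
weights = np.exp(scores - scores.max())
weights /= weights.sum()                      # softmax over time steps

# Context vector: weighted sum of ALL hidden states, not just the last one
context = weights @ H
```

Because the context vector is recomputed at every decoding step with fresh weights, the decoder can focus on whichever part of the input is relevant right now, which removes the fixed-size bottleneck for long sequences.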


Publisher Resources

ISBN: 9781789348460