December 2019
Intermediate to advanced
468 pages
14h 28m
English
In this section, we'll use PyTorch 1.3.1 to implement a simple NMT example with a seq2seq attention model. Specifically, we'll take the seq2seq model we introduced in the Introducing seq2seq models section and extend it with Luong attention. The encoder will take as input a text sequence (a sentence) in one language, and the decoder will output the corresponding sequence translated into another language.
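Before the full model, the core of the extension can be sketched in isolation. The following is a minimal, illustrative sketch (not the book's implementation) of Luong's "general" multiplicative attention, where the score between the decoder state h_t and an encoder state h_s is h_t^T W h_s; the class and variable names are assumptions for the example:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LuongAttention(nn.Module):
    """Luong 'general' attention: score(h_t, h_s) = h_t^T W h_s."""

    def __init__(self, hidden_size):
        super().__init__()
        # Learned weight matrix W of the 'general' score function
        self.W = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden: (batch, hidden)
        # encoder_outputs: (batch, src_len, hidden)
        # Scores for every source position: (batch, src_len)
        scores = torch.bmm(self.W(encoder_outputs),
                           decoder_hidden.unsqueeze(2)).squeeze(2)
        # Normalize scores into an attention distribution over the source
        weights = F.softmax(scores, dim=1)
        # Context vector: weighted sum of encoder outputs, (batch, hidden)
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
        return context, weights

# Tiny smoke test with random tensors (batch=2, src_len=5, hidden=8)
torch.manual_seed(0)
attn = LuongAttention(8)
context, weights = attn(torch.randn(2, 8), torch.randn(2, 5, 8))
print(context.shape, weights.shape)
```

In the full model, the decoder would compute such a context vector at every step and combine it with its hidden state before predicting the next output token.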