11 Sequence-to-sequence
This chapter covers
- Preparing a sequence-to-sequence dataset and loader
- Combining RNNs with attention mechanisms
- Building a machine translation model
- Interpreting attention scores to understand a model’s decisions
Now that we have learned about attention mechanisms, we can wield them to build something new and powerful. In particular, we will develop an algorithm known as sequence-to-sequence (Seq2Seq for short) that can perform machine translation. As the name implies, this is an approach for getting neural networks to take one sequence as input and produce a different sequence as output. Seq2Seq has been used to get computers to perform symbolic calculus,[1] summarize long documents,[2] and even translate from one language to another.
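To make the input-sequence-to-output-sequence idea concrete before we build the full model, here is a minimal encoder-decoder sketch in PyTorch. It is an illustration, not the chapter's model: the GRU layers, hidden size, vocabulary sizes, and the use of teacher forcing (feeding the true target tokens to the decoder) are all assumptions for this sketch, and it omits the attention mechanism we will add later in the chapter.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """A minimal encoder-decoder: the encoder GRU compresses the input
    sequence into a final hidden state, which seeds the decoder GRU that
    predicts the output sequence."""
    def __init__(self, src_vocab, tgt_vocab, hidden=128):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, hidden)
        self.tgt_embed = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # src: (B, S) and tgt: (B, T) are batches of token indices
        _, h = self.encoder(self.src_embed(src))            # h: (1, B, hidden)
        # Teacher forcing: decode conditioned on the true target tokens
        dec_out, _ = self.decoder(self.tgt_embed(tgt), h)   # (B, T, hidden)
        return self.out(dec_out)                            # (B, T, tgt_vocab)

# Hypothetical vocabulary sizes and sequence lengths, just to show the shapes
model = Seq2Seq(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (4, 9))   # 4 source sequences of length 9
tgt = torch.randint(0, 1200, (4, 7))   # 4 target sequences of length 7
logits = model(src, tgt)
print(logits.shape)                    # torch.Size([4, 7, 1200])
```

Note that the input and output lengths differ (9 vs. 7): this is the defining property of Seq2Seq that distinguishes it from the sequence-labeling setups we have seen before.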