June 2022 · Intermediate to advanced · 600 pages · 17h 56m · English
This chapter covers
Now that we have learned about attention mechanisms, we can wield them to build something new and powerful. In particular, we will develop an algorithm known as sequence-to-sequence (Seq2Seq for short) that can perform machine translation. As the name implies, this is an approach for getting neural networks to take one sequence as input and produce a different sequence as the output. Seq2Seq has been used to get computers to perform symbolic calculus,[1] summarize long documents,[2] and even translate from one language ...
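To make the idea concrete before any real model is built, here is a minimal toy sketch of the Seq2Seq pattern: an encoder turns the input tokens into a sequence of hidden states, and a decoder repeatedly attends over those states to emit an output sequence whose length need not match the input's. Everything here is illustrative — the `encode`, `attend`, and `decode` functions are stand-ins (a cumulative mean replaces a real recurrent encoder, and `tanh` of the attention context replaces a real decoder cell), not the chapter's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x_embeds):
    # Toy "encoder": the cumulative mean of the token embeddings stands in
    # for the hidden states a recurrent network would produce, one per token.
    return np.cumsum(x_embeds, axis=0) / np.arange(1, len(x_embeds) + 1)[:, None]

def attend(query, enc_states):
    # Dot-product attention: score each encoder state against the decoder
    # query, softmax the scores, and return the weighted sum (the context).
    scores = enc_states @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ enc_states, weights

def decode(enc_states, out_len):
    # Toy autoregressive decoder: each step attends over all encoder states,
    # so the output length is free to differ from the input length.
    outputs = []
    query = enc_states[-1]            # seed the decoder with the final encoder state
    for _ in range(out_len):
        context, _ = attend(query, enc_states)
        query = np.tanh(context)      # stand-in for a real decoder cell update
        outputs.append(query)
    return np.stack(outputs)

src = rng.normal(size=(5, 8))         # 5 input "tokens", embedding dimension 8
states = encode(src)                  # one hidden state per input token
out = decode(states, out_len=3)       # 3 output steps: a different-length sequence
```

The key structural point the sketch shows is the decoupling: nothing in `decode` depends on the input length except through attention over the encoder states, which is exactly what lets one sequence map to a differently sized one.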