December 2019
Intermediate to advanced
468 pages
14h 28m
English
In Chapter 7, Understanding Recurrent Networks, we outlined several types of recurrent models, classified by their input-output combinations. One of them is indirect many-to-many, or sequence-to-sequence (seq2seq), where an input sequence is transformed into a different output sequence, not necessarily of the same length as the input. Machine translation is the most popular seq2seq task: the input sequence is the words of a sentence in one language, and the output sequence is the words of the same sentence translated into another language. For example, we can translate the English sequence tourist attraction into the German Touristenattraktion. Not only is the output sentence a different ...