Neural machine translation - training a seq2seq RNN

Sequence to sequence (seq2seq) is a particular kind of RNN architecture with successful applications in neural machine translation, text summarization, and speech recognition. In this recipe, we will discuss how to implement neural machine translation with results similar to those achieved by the Google Neural Machine Translation system (https://research.googleblog.com/2016/09/a-neural-network-for-machine.html). The key idea is to input a whole sequence of text, understand its entire meaning, and then output the translation as another sequence. The idea of reading an entire sequence is very different from previous architectures, where a fixed set of words was translated from the source language to a target language.
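The encode-then-decode idea described above can be sketched in a few lines of plain NumPy: an encoder RNN reads the whole source sequence into a single context vector, and a decoder RNN then emits target tokens one at a time conditioned on that vector. This is only a minimal illustration with randomly initialized (untrained) weights and made-up toy sizes, not the book's TensorFlow implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: a vocabulary of 10 token ids and an
# 8-dimensional embedding/hidden state (a real system uses far more).
vocab_size, hidden = 10, 8

# Randomly initialized parameters for single-layer Elman-style RNNs;
# training would learn these from parallel sentence pairs.
E = rng.normal(0, 0.1, (vocab_size, hidden))   # embedding table
W_enc = rng.normal(0, 0.1, (hidden, hidden))   # encoder input weights
U_enc = rng.normal(0, 0.1, (hidden, hidden))   # encoder recurrent weights
W_dec = rng.normal(0, 0.1, (hidden, hidden))   # decoder input weights
U_dec = rng.normal(0, 0.1, (hidden, hidden))   # decoder recurrent weights
V = rng.normal(0, 0.1, (hidden, vocab_size))   # hidden state -> vocab logits

def encode(src_tokens):
    """Read the entire source sequence into one context vector."""
    h = np.zeros(hidden)
    for t in src_tokens:
        h = np.tanh(E[t] @ W_enc + h @ U_enc)
    return h  # the "thought vector" summarizing the whole sentence

def decode(context, max_len=5, bos=0):
    """Emit target tokens one at a time, conditioned on the context."""
    h, tok, out = context, bos, []
    for _ in range(max_len):
        h = np.tanh(E[tok] @ W_dec + h @ U_dec)
        tok = int(np.argmax(h @ V))  # greedy choice of the next token
        out.append(tok)
    return out

translation = decode(encode([3, 1, 4, 1, 5]))
print(translation)  # a list of target-token ids
```

With untrained weights the output is meaningless, but the control flow is the point: unlike fixed-window models, nothing is emitted until the encoder has consumed the full input sequence.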