Implementing the encoder

Next, let's implement the encoder.

We'll start with the constructor:

class EncoderRNN(torch.nn.Module):
    def __init__(self, input_size, hidden_size):
        super(EncoderRNN, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        # Embedding for the input words
        self.embedding = torch.nn.Embedding(input_size, hidden_size)
        # The actual RNN cell
        self.rnn_cell = torch.nn.GRU(hidden_size, hidden_size)

The entry point is the self.embedding module, which takes the index of each word and returns its assigned embedding vector. We will not use pretrained word vectors (such as GloVe), but the concept of embedding vectors is nevertheless the same; it's just that we'll initialize them ...
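To see how the constructor's two modules fit together, here is a minimal sketch of a forward pass that processes one word index at a time. The forward method, its signature, and the tensor shapes are assumptions for illustration; they are not taken from the excerpt above.

```python
import torch

class EncoderRNN(torch.nn.Module):
    def __init__(self, input_size, hidden_size):
        super(EncoderRNN, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        # Embedding for the input words
        self.embedding = torch.nn.Embedding(input_size, hidden_size)
        # The actual RNN cell
        self.rnn_cell = torch.nn.GRU(hidden_size, hidden_size)

    def forward(self, input_index, hidden):
        # Assumed method: look up the embedding for a single word index
        # and reshape it to (seq_len=1, batch=1, hidden_size) for the GRU
        embedded = self.embedding(input_index).view(1, 1, -1)
        output, hidden = self.rnn_cell(embedded, hidden)
        return output, hidden

# Usage sketch: encode word index 3 from a 10-word vocabulary
encoder = EncoderRNN(input_size=10, hidden_size=16)
hidden = torch.zeros(1, 1, 16)  # initial hidden state
output, hidden = encoder(torch.tensor([3]), hidden)
```

Here, self.embedding maps the integer index to a dense vector of size hidden_size, and the GRU consumes that vector along with the previous hidden state, returning both an output and the updated hidden state to feed into the next step.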
