Implementing the encoder

Next, let's continue by implementing the encoder.

We'll start with the constructor:

import torch


class EncoderRNN(torch.nn.Module):
    def __init__(self, input_size, hidden_size):
        super(EncoderRNN, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size

        # Embedding for the input words
        self.embedding = torch.nn.Embedding(input_size, hidden_size)

        # The actual RNN cell
        self.rnn_cell = torch.nn.GRU(hidden_size, hidden_size)

The entry point is the self.embedding module. It takes the index of each word and returns the word's assigned embedding vector. We will not use pretrained word vectors (such as GloVe), but the concept of embedding vectors is the same; it's just that we'll initialize them randomly and learn them as part of the model's training.
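
To make this concrete, here is a small standalone sketch (not from the book) of how an embedding layer maps a word index to a dense vector and how that vector can be fed to a GRU; the vocabulary size and hidden size are assumed toy values:

import torch

# Toy sizes assumed purely for illustration
vocab_size, hidden_size = 10, 4

embedding = torch.nn.Embedding(vocab_size, hidden_size)
rnn_cell = torch.nn.GRU(hidden_size, hidden_size)

# A single word is represented by its integer index in the vocabulary
word_index = torch.tensor([3])

# The embedding layer returns the (randomly initialized) vector for that index,
# reshaped to (seq_len=1, batch=1, hidden_size) as the GRU expects
embedded = embedding(word_index).view(1, 1, -1)

# The GRU consumes the embedding together with an initial hidden state of zeros
hidden = torch.zeros(1, 1, hidden_size)
output, hidden = rnn_cell(embedded, hidden)
print(output.shape, hidden.shape)  # torch.Size([1, 1, 4]) torch.Size([1, 1, 4])

The embedding weights start out random, and backpropagation adjusts them along with the rest of the network's parameters.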
