December 2019
Intermediate to advanced
468 pages
14h 28m
English
Next, we'll implement the AttnDecoderRNN decoder with Luong attention. It works in combination with the EncoderRNN encoder.
We'll start with the AttnDecoderRNN.__init__ method:
class AttnDecoderRNN(torch.nn.Module):
    def __init__(self, hidden_size, output_size, max_length=MAX_LENGTH, dropout=0.1):
        super(AttnDecoderRNN, self).__init__()
        self.hidden_size = hidden_size
        self.output_size = output_size
        self.max_length = max_length

        # Embedding for the input word
        self.embedding = torch.nn.Embedding(self.output_size, self.hidden_size)
        self.dropout = torch.nn.Dropout(dropout)

        # Attention portion
        self.attn = torch.nn.Linear(in_features=self.hidden_size,
                                    out_features=self.hidden_size)
        self.w_c = torch.nn.Linear(in_features ...
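To make the roles of the attn and w_c layers concrete, here is a minimal sketch of one Luong "general" attention step, assuming the standard formulation (score(h_t, h_s) = h_t W h_s, followed by a softmax, a weighted context vector, and the attentional state tanh(W_c [c_t; h_t])). The layer names mirror the snippet above, but the forward logic here is an illustration, not necessarily the book's exact implementation:

```python
import torch

torch.manual_seed(0)
hidden_size, seq_len = 8, 5

# Layers analogous to those in AttnDecoderRNN.__init__
attn = torch.nn.Linear(hidden_size, hidden_size)     # W in the "general" score h_t W h_s
w_c = torch.nn.Linear(2 * hidden_size, hidden_size)  # W_c, combines context and hidden state

h_t = torch.randn(1, hidden_size)                    # current decoder hidden state
encoder_outputs = torch.randn(seq_len, hidden_size)  # one vector per source position

# Alignment scores over all source positions, then normalize to attention weights
scores = torch.mm(attn(h_t), encoder_outputs.t())    # shape [1, seq_len]
weights = torch.softmax(scores, dim=-1)              # rows sum to 1

# Context vector: attention-weighted sum of encoder outputs
context = torch.mm(weights, encoder_outputs)         # shape [1, hidden_size]

# Attentional hidden state: tanh(W_c [c_t; h_t])
h_tilde = torch.tanh(w_c(torch.cat([context, h_t], dim=-1)))  # shape [1, hidden_size]
```

In a full decoder, h_tilde would feed the output projection that predicts the next word, which is why w_c takes 2 * hidden_size inputs: the concatenated context vector and decoder hidden state.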