It is a two-step process to tell PyTorch not to change the weights of the embedding layer:

1. Set the requires_grad attribute of the embedding weights to False, so that PyTorch does not compute gradients for them.
2. Pass only the parameters that require gradients to the optimizer; otherwise, the optimizer throws an error, since a frozen parameter has no gradients to update.
The following code demonstrates how easy it is to freeze the embedding layer weights and instruct the optimizer not to use those parameters:
import torch.optim as optim

# Step 1: freeze the embedding layer weights
model.embedding.weight.requires_grad = False

# Step 2: give the optimizer only the parameters that still require gradients
optimizer = optim.SGD(
    [param for param in model.parameters() if param.requires_grad],
    lr=0.001,
)
We generally pass all the model parameters to the optimizer, but in the preceding code we pass only the parameters that have requires_grad set to True.
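To see both steps in context, here is a minimal sketch of freezing an embedding layer inside a small model. SimpleClassifier and its sizes are hypothetical names chosen for illustration, not code from the book:

# A minimal sketch; SimpleClassifier and its dimensions are hypothetical,
# chosen only to give the freezing pattern some surrounding context.
import torch
import torch.nn as nn
import torch.optim as optim

class SimpleClassifier(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=50, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        # Average the word embeddings over the sequence dimension
        return self.fc(self.embedding(x).mean(dim=1))

model = SimpleClassifier()

# Step 1: freeze the embedding weights
model.embedding.weight.requires_grad = False

# Step 2: hand the optimizer only the trainable parameters
optimizer = optim.SGD(
    [param for param in model.parameters() if param.requires_grad],
    lr=0.001,
)

# One training step: only the linear layer's weights and bias are updated
inputs = torch.randint(0, 1000, (4, 10))   # batch of 4 sequences of length 10
targets = torch.randint(0, 2, (4,))
loss = nn.functional.cross_entropy(model(inputs), targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()

Because the frozen embedding weights are excluded from the optimizer, only the linear layer changes during training, while the embedding layer keeps its original (for example, pretrained) values.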