In the previous sections, we created the DataLoader instances and the model architecture. Now, let's train the model. To do this, we need a loss function and an optimizer:
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler

# Loss and optimizer
learning_rate = 0.001
criterion = nn.CrossEntropyLoss()
optimizer_ft = optim.SGD(model_ft.parameters(), lr=learning_rate, momentum=0.9)
# Decay the learning rate by a factor of 0.1 every 7 epochs
exp_lr_scheduler = lr_scheduler.StepLR(optimizer_ft, step_size=7, gamma=0.1)
In the preceding code, we created our loss function based on CrossEntropyLoss and our optimizer based on SGD. The StepLR scheduler changes the learning rate dynamically during training: it multiplies the learning rate by gamma (here 0.1) every step_size (here 7) epochs. We will discuss the different strategies available for tuning the learning rate in Chapter 4, Fundamentals of Machine Learning.
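To see what this schedule does in isolation, here is a minimal sketch (not from the chapter) that steps a scheduler configured like the one above and prints the learning rate as it decays. The tiny nn.Linear model is only a stand-in for model_ft, and no real training is performed:

import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler

# Stand-in model; in the chapter, model_ft would be used instead
model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
scheduler = lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)

for epoch in range(21):
    # ... forward pass, loss computation, and loss.backward() would go here ...
    optimizer.step()   # parameter update (a no-op here, since no gradients were computed)
    scheduler.step()   # multiply the learning rate by gamma every step_size epochs
    print(epoch, optimizer.param_groups[0]['lr'])

Running this shows the learning rate staying constant for step_size epochs and then dropping by a factor of gamma, which is the behavior the train_model function below relies on.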
The following train_model function takes in a ...