December 2019
Intermediate to advanced
468 pages
14h 28m
English
We'll implement the training itself with the help of the train function, which takes the network and the CUDA device as parameters. We'll use cross-entropy loss and the Adam optimizer (the usual combination for classification tasks). The function simply iterates EPOCHS times and calls the train_epoch and test functions for each epoch. The following is the implementation:
def train(model: torch.nn.Module, device: torch.device):
    loss_function = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters())

    # read datasets
    train_loader, val_order = create_datasets()

    # train
    for epoch in range(EPOCHS):
        print('Epoch {}/{}'.format(epoch + 1, EPOCHS))
        train_epoch(model, device, loss_function, optimizer, train_loader)
        test(model ...