PyTorch has most of the common non-linear activation functions implemented for us already, and they can be used like any other layer. Let's see a quick example of how to use the ReLU function in PyTorch:
import torch
from torch.autograd import Variable
from torch.nn import ReLU

sample_data = Variable(torch.Tensor([[1,2,-1,-1]]))
myRelu = ReLU()
myRelu(sample_data)

Output:

Variable containing:
 1  2  0  0
[torch.FloatTensor of size 1x4]
In the preceding example, we take a tensor with two positive values and two negative values and apply a ReLU on it, which thresholds the negative numbers to 0 and retains the positive numbers as they are.
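Since ReLU is just another nn.Module, it can also be dropped directly into a model definition like any other layer. The following is a minimal sketch of that idea; the layer sizes are arbitrary and chosen only for illustration:

import torch
from torch import nn
from torch.autograd import Variable

# ReLU placed between two Linear layers inside an nn.Sequential stack,
# used exactly like any other layer (sizes here are illustrative only).
model = nn.Sequential(
    nn.Linear(4, 3),
    nn.ReLU(),
    nn.Linear(3, 1),
)

sample_data = Variable(torch.Tensor([[1,2,-1,-1]]))
print(model(sample_data))  # activations from the first Linear are clamped at 0 by ReLU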
Now that we have covered most of the details required for building a network architecture, let's build a deep learning architecture that can be used to solve real-world ...