We have already observed a couple of times that the features passed to machine learning or deep learning algorithms are normalized; that is, the features are centered at zero by subtracting the mean from the data, and given a unit standard deviation by dividing the data by its standard deviation. In PyTorch, we would generally do this with the torchvision.transforms.Normalize transform. The following code shows an example:
from torchvision import transforms

transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
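For intuition, the same operation can be written by hand; the following is a minimal sketch (the random tensor stands in for an image produced by ToTensor(), with values in [0, 1]) showing that Normalize computes (x - mean) / std per channel, which here maps values from [0, 1] to [-1, 1]:

import torch

# A stand-in for a 3-channel image with values in [0, 1]
x = torch.rand(3, 4, 4)

# Per-channel mean and std, reshaped to broadcast over height and width
mean = torch.tensor([0.5, 0.5, 0.5]).view(3, 1, 1)
std = torch.tensor([0.5, 0.5, 0.5]).view(3, 1, 1)

# Equivalent to transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))(x)
y = (x - mean) / std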
In all the examples we have seen, the data is normalized just before it enters a neural network; there is no guarantee that the intermediate layers get a normalized input. The following figure shows how the intermediate ...
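To make this concrete, here is a minimal sketch (the two-layer network and the input tensor are hypothetical, chosen only for illustration) that feeds a normalized input through one linear layer and a ReLU, then prints the statistics of the intermediate activation; its mean and standard deviation are no longer 0 and 1:

import torch
import torch.nn as nn

torch.manual_seed(0)

# Normalized input: approximately zero mean, unit standard deviation
x = torch.randn(64, 100)

layer1 = nn.Linear(100, 50)
layer2 = nn.Linear(50, 10)

# The intermediate activation that layer2 actually receives
h = torch.relu(layer1(x))

print(f"input:        mean={x.mean():.3f}, std={x.std():.3f}")
print(f"intermediate: mean={h.mean():.3f}, std={h.std():.3f}")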