April 2018
This additional step applies a non-linear activation to the output of the convolution layer, introducing non-linearity into the convolved feature map. We learned about the ReLU function in an earlier section. Images contain highly non-linear patterns, but convolution itself is built from linear operations, multiplications and summations, so stacking convolutions alone would leave the network linear. Applying a non-linear activation function such as ReLU after each convolution preserves the network's ability to model the non-linearity in the images.
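As a minimal sketch of this step, the snippet below applies ReLU element-wise to a small, hypothetical convolved feature map (the values are made up for illustration): every negative response is clamped to zero while positive responses pass through unchanged.

```python
import numpy as np

# Hypothetical 4x4 convolved feature map with mixed
# positive and negative responses.
feature_map = np.array([
    [ 2.0, -1.5,  0.3, -0.2],
    [-0.7,  4.1, -3.3,  1.2],
    [ 0.0, -2.2,  5.6, -0.9],
    [ 1.8, -0.4,  2.7, -6.1],
])

def relu(x):
    # ReLU: max(0, x) element-wise. Negative activations become 0,
    # positive activations are kept as-is.
    return np.maximum(0, x)

activated = relu(feature_map)
print(activated)
```

After this step the activated map contains no negative values, which is what breaks the chain of purely linear operations before the next layer.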
The next stage in a CNN is the pooling layer.