Hands-On Automated Machine Learning
by Sibanjan Das, Umit Mert Cakmak
Packt Publishing, April 2018
Content preview from Hands-On Automated Machine Learning

The ReLU layer

This additional step is applied to the output of the convolution layer to introduce non-linearity into the convolved feature map. We learned about the ReLU function in an earlier section. Images contain highly non-linear patterns, but convolution consists only of linear operations such as element-wise multiplication and summation, so the resulting feature maps risk being purely linear transformations of the input. A non-linear activation function, such as ReLU, is therefore applied to preserve the non-linearity in the images.
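
The book's code listings are not part of this excerpt, so the following is only a minimal sketch of the ReLU step: first as an element-wise max(0, x) on a small NumPy array standing in for a convolved feature map, and then as an Activation layer placed between a convolution layer and a pooling layer in a Keras model (assuming TensorFlow's Keras API; the filter count, kernel size, and input shape are illustrative choices, not taken from the book).

# ReLU applied directly to a toy convolved feature map
import numpy as np

feature_map = np.array([[-2.0, 1.5],
                        [ 0.3, -0.7]])       # example convolved output
relu_output = np.maximum(0, feature_map)     # ReLU: negative values become 0
print(relu_output)                           # [[0.  1.5]
                                             #  [0.3 0. ]]

# The same step inside a CNN layer stack (TensorFlow's Keras API assumed)
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), input_shape=(64, 64, 3)),  # convolution layer
    layers.Activation('relu'),                           # ReLU layer adds non-linearity
    layers.MaxPooling2D((2, 2)),                         # pooling layer comes next
])

In practice the activation is often specified inline, for example Conv2D(32, (3, 3), activation='relu'); keeping it as a separate Activation layer here simply mirrors the idea of the ReLU layer as its own stage after convolution.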

The next stage in a CNN is the pooling layer.


ISBN: 9781788629898