
Hands-On Automated Machine Learning

by Sibanjan Das, Umit Mert Cakmak
April 2018
Beginner to intermediate
282 pages
6h 52m
English
Packt Publishing
Content preview from Hands-On Automated Machine Learning

The ReLU function

Researchers have found that a neural network using the Rectified Linear Unit (ReLU) function trains faster than one using other non-linear functions such as sigmoid and tanh, without a significant drop in accuracy. This makes ReLU one of the most important activation functions. It gives an output of x if x is positive, and 0 otherwise.

It is defined as follows:

A(x) = max(0, x)

[Figure: plot of the ReLU activation function, A(x) = max(0, x)]

ReLU is a non-linear function, and a combination of ReLU functions is also non-linear. The range of ReLU is from 0 to infinity.
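The definition above can be sketched in a few lines of Python. This is a minimal illustration only; deep-learning frameworks such as Keras or PyTorch ship their own optimized implementations:

```python
def relu(x):
    """ReLU activation: returns x if x is positive, 0 otherwise.

    Equivalent to A(x) = max(0, x).
    """
    return max(0.0, x)

print(relu(3.5))   # positive inputs pass through unchanged: 3.5
print(relu(-2.0))  # negative inputs are clipped: 0.0
```

Note that the output is never negative, which matches the stated range of 0 to infinity.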

Next, we discuss the tanh function, which is very similar ...



ISBN: 9781788629898