Machine Learning Quick Reference

by Rahul Kumar
January 2019
Intermediate to advanced content level
294 pages
6h 43m
English
Packt Publishing
Content preview from Machine Learning Quick Reference

Overcoming vanishing gradient

From the preceding explanation of vanishing gradients, it follows that the root cause of this problem is the choice of the sigmoid function as the activation function. A similar problem arises when tanh is chosen as the activation function.

To counter this scenario, the ReLU (rectified linear unit) function comes to the rescue:

ReLU(x)= max(0,x)

If the input is negative (less than zero), the function outputs zero. In the second scenario, if the input is greater than zero, the output is equal to the input.
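The two cases above can be sketched in plain Python (the function name `relu` is ours, not from the book):

```python
def relu(x):
    """ReLU activation: zero for negative inputs, identity otherwise."""
    return max(0.0, x)

# Negative inputs are clamped to zero; positive inputs pass through unchanged.
print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5, 3.0]])  # [0.0, 0.0, 0.0, 1.5, 3.0]
```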

Let's take the derivative of this function and see what happens:

Case 1: x < 0: ReLU'(x) = 0

Case 2: x > 0: ReLU'(x) = 1
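The piecewise derivative from the two cases can be written as a small helper (the name `relu_derivative` and the convention of assigning 0 at the kink x == 0 are our choices, not the book's):

```python
def relu_derivative(x):
    """Derivative of ReLU: 0 for x < 0, 1 for x > 0.
    The function is not differentiable at x == 0; by convention we return 0 there."""
    return 1.0 if x > 0 else 0.0

# Unlike the sigmoid's derivative (at most 0.25), the gradient for any
# positive input is exactly 1, so backpropagating through many layers
# does not shrink the gradient toward zero.
print([relu_derivative(v) for v in [-2.0, -0.5, 1.5, 3.0]])  # [0.0, 0.0, 1.0, 1.0]
```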

If we ...



Publisher Resources

ISBN: 9781788830577