Java Deep Learning Projects

by Md. Rezaul Karim
June 2018
Intermediate to advanced
436 pages
10h 33m
English
Packt Publishing
Content preview from Java Deep Learning Projects

Activation functions

To allow a neural network to learn complex decision boundaries, we apply a non-linear activation function to some of its layers. Commonly used functions include tanh, ReLU, softmax, and variants of these. More technically, each neuron receives as its input signal the weighted sum of the activations of the neurons connected to it, with the synaptic weights serving as the weighting coefficients. One of the most widely used functions for this purpose is the so-called sigmoid function. It is a special case of the logistic function, which is defined by the following formula:

f(x) = 1 / (1 + e^(-x))

The domain of this function is the set of all real numbers, and its co-domain is the open interval (0, 1). This means ...
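To make this concrete, here is a minimal, self-contained Java sketch (not code from the book) that implements the sigmoid alongside tanh and ReLU and applies them to a neuron's weighted sum of inputs; the class name Activations, the helper weightedSum, and the example values are illustrative assumptions, not the book's own code.

import java.util.function.DoubleUnaryOperator;

// Illustrative sketch of common activation functions; names and values are assumptions.
public class Activations {

    // Sigmoid: maps any real number into the open interval (0, 1).
    static final DoubleUnaryOperator SIGMOID = x -> 1.0 / (1.0 + Math.exp(-x));

    // Tanh: maps any real number into (-1, 1).
    static final DoubleUnaryOperator TANH = Math::tanh;

    // ReLU: zero for negative inputs, identity for positive inputs.
    static final DoubleUnaryOperator RELU = x -> Math.max(0.0, x);

    // A neuron's pre-activation: the weighted sum of incoming activations plus a bias.
    static double weightedSum(double[] inputs, double[] weights, double bias) {
        double sum = bias;
        for (int i = 0; i < inputs.length; i++) {
            sum += inputs[i] * weights[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] inputs  = {0.5, -1.2, 3.0};   // activations of the connected neurons
        double[] weights = {0.4,  0.1, -0.7};  // synaptic weights
        double z = weightedSum(inputs, weights, 0.2);
        System.out.println("pre-activation z = " + z);
        System.out.println("sigmoid(z) = " + SIGMOID.applyAsDouble(z));
        System.out.println("tanh(z)    = " + TANH.applyAsDouble(z));
        System.out.println("relu(z)    = " + RELU.applyAsDouble(z));
    }
}

Whichever activation is chosen, the pattern is the same: compute the weighted sum, then pass it through the non-linearity. Of the three, only the sigmoid output is guaranteed to lie in (0, 1), which is the property the excerpt goes on to discuss.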



Publisher Resources

ISBN: 9781788997454