Java Deep Learning Projects

by Md. Rezaul Karim
Packt Publishing, June 2018
436 pages
ISBN: 9781788997454

Output layer

The number of input neurons equals the number of outputs from hidden layer 1, and the number of output neurons equals the number of predicted labels. Once again we keep this value small, since there are very few inputs and features.

Here we used the Softmax activation function, which gives us a probability distribution over the classes (the outputs sum to 1.0), and cross-entropy for binary classification (XENT) as the loss function, since we want to convert the output (a probability) into a discrete class, that is, zero or one:

import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

OutputLayer output_layer = new OutputLayer.Builder(LossFunction.XENT) // XENT for binary classification
        .weightInit(WeightInit.XAVIER)
        .activation(Activation.SOFTMAX) // probability distribution over the two classes
        .nIn(16)                        // matches the 16 outputs of hidden layer 1
        .nOut(numOutputs)               // one output neuron per predicted label
        .build();
XENT is used for ...
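
For context, the following is a minimal sketch of how an output layer like this is typically wired into a full DL4J network. The hidden-layer setup, the seed, the Adam updater, and the numInputs placeholder are illustrative assumptions, not taken from the book:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.learning.config.Adam;

// Assumed two-layer setup: one dense hidden layer feeding the output layer built above.
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(12345)              // assumed seed for reproducibility
        .updater(new Adam(0.001)) // assumed optimizer and learning rate
        .list()
        .layer(0, new DenseLayer.Builder()
                .nIn(numInputs)   // numInputs: placeholder for the feature count
                .nOut(16)         // 16 outputs, matching nIn(16) of the output layer
                .weightInit(WeightInit.XAVIER)
                .activation(Activation.RELU)
                .build())
        .layer(1, output_layer)   // the output layer configured above
        .build();

MultiLayerNetwork model = new MultiLayerNetwork(conf);
model.init(); // initialize weights before calling model.fit(...) on the training data

With the configuration built this way, the softmax output of the final layer can be read off per example and thresholded into the discrete class labels zero or one.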