Java Deep Learning Projects

by Md. Rezaul Karim
June 2018
Intermediate to advanced
436 pages
English
Packt Publishing

Content preview from Java Deep Learning Projects

Multilayer Perceptron

As discussed earlier, a single perceptron is incapable of approximating even a simple XOR function. To overcome this limitation, multiple perceptrons are stacked together to form an MLP, in which the layers are connected as a directed graph. This way, the signal propagates in only one direction, from the input layer through the hidden layers to the output layer, as shown in the following diagram:

An MLP architecture having an input layer, two hidden layers, and an output layer

Fundamentally, an MLP is one of the simplest FFNNs, having at least three layers: an input layer, a hidden layer, and an output layer. An MLP was first trained with a backpropagation algorithm in ...
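
To make the layered structure concrete, the following is a minimal sketch of the architecture from the preceding diagram (an input layer, two hidden layers, and an output layer), assuming the Deeplearning4j API; the class name MlpSketch and the layer sizes (2 inputs, two hidden layers of 4 units, and 2 output classes) are illustrative assumptions rather than values taken from the book:

    import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
    import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.deeplearning4j.nn.conf.layers.OutputLayer;
    import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
    import org.nd4j.linalg.activations.Activation;
    import org.nd4j.linalg.lossfunctions.LossFunctions;

    public class MlpSketch {
        public static void main(String[] args) {
            // Input -> hidden -> hidden -> output, mirroring the preceding diagram.
            // Layer sizes are illustrative assumptions (2 inputs, 4 + 4 hidden units, 2 output classes).
            MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(12345)                        // fixed seed for reproducibility
                .list()
                .layer(0, new DenseLayer.Builder()  // first hidden layer
                    .nIn(2).nOut(4)
                    .activation(Activation.RELU)
                    .build())
                .layer(1, new DenseLayer.Builder()  // second hidden layer
                    .nIn(4).nOut(4)
                    .activation(Activation.RELU)
                    .build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                    .nIn(4).nOut(2)                 // two-class softmax output layer
                    .activation(Activation.SOFTMAX)
                    .build())
                .build();

            MultiLayerNetwork model = new MultiLayerNetwork(conf);
            model.init();                           // initialize weights; fit(...) would train via backpropagation
            System.out.println(model.summary());
        }
    }

Calling model.fit(...) on labelled data, for example the four XOR input/label pairs, would train these stacked layers with backpropagation, which is precisely the kind of mapping a single perceptron cannot learn.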



Publisher Resources

ISBN: 9781788997454