Java Deep Learning Cookbook

by Rahul Raj
November 2019
Intermediate to advanced
304 pages
8h 40m
English
Packt Publishing
Content preview from Java Deep Learning Cookbook

How it works...

In step 1, while configuring the generic neural network parameters, we set the updater with the default learning rate, as shown here:

configBuilder.updater(new AdaGrad(learningRate));

The AdaGrad optimizer adapts the learning rate according to how frequently each parameter is updated during training. It maintains a per-parameter learning rate: internally, it accumulates the squared gradients for every parameter, so parameters that receive many updates get a smaller effective learning rate, while rarely updated parameters keep a larger one. This per-parameter adaptation is crucial for sparse, high-dimensional problems. Hence, this optimizer can be a good fit for our autoencoder use case.
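The per-parameter behavior can be illustrated with a minimal plain-Java sketch of the AdaGrad update rule. This is not the DL4J implementation; the class, method, and gradient values are illustrative assumptions chosen to show the effective step size shrinking for a frequently updated parameter:

```java
import java.util.Arrays;

public class AdaGradSketch {

    // Applies AdaGrad updates to a parameter vector initialized to 1.0.
    // gradients[i][step] is the gradient for parameter i at a given step.
    static double[] run(double learningRate, double[][] gradients) {
        double epsilon = 1e-8;                    // numerical stability term
        int n = gradients.length;
        double[] params = new double[n];
        Arrays.fill(params, 1.0);
        double[] accum = new double[n];           // running sum of squared gradients

        for (int step = 0; step < gradients[0].length; step++) {
            for (int i = 0; i < n; i++) {
                double g = gradients[i][step];
                accum[i] += g * g;
                // The effective step shrinks as the accumulator grows.
                params[i] -= learningRate * g / (Math.sqrt(accum[i]) + epsilon);
            }
        }
        return params;
    }

    public static void main(String[] args) {
        // Parameter 0 receives frequent large gradients; parameter 1 a single small one.
        double[][] grads = {{0.9, 0.9, 0.9}, {0.1, 0.0, 0.0}};
        double[] p = run(0.1, grads);
        System.out.printf("params = [%.4f, %.4f]%n", p[0], p[1]);
    }
}
```

Parameter 0 is pushed with the same gradient three times, yet each successive step is smaller than the last because its squared-gradient accumulator keeps growing, which is the behavior that makes AdaGrad robust to parameters with very different update frequencies.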

In an autoencoder architecture, the input layers perform dimensionality reduction; this is also known as encoding the data. We want to ensure that the same set of features can be decoded from the encoded representation. We calculate reconstruction ...
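Reconstruction quality is commonly scored as the mean squared error between the original input and the decoder's output. The following is a minimal sketch of that computation, assuming a hypothetical decoded vector rather than any particular DL4J loss implementation:

```java
public class ReconstructionError {

    // Mean squared error between the original input and the decoded output.
    static double mse(double[] input, double[] reconstruction) {
        double sum = 0.0;
        for (int i = 0; i < input.length; i++) {
            double diff = input[i] - reconstruction[i];
            sum += diff * diff;
        }
        return sum / input.length;
    }

    public static void main(String[] args) {
        double[] input   = {1.0, 0.0, 0.5, 0.25};
        double[] decoded = {0.9, 0.1, 0.5, 0.30};  // hypothetical decoder output
        System.out.println("reconstruction MSE = " + mse(input, decoded));
    }
}
```

A low value indicates that the encoder has preserved enough information in the compressed representation for the decoder to recover the original features.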



Publisher Resources

ISBN: 9781788995207