Java Deep Learning Cookbook

by Rahul Raj
November 2019
Intermediate to advanced
304 pages
8h 40m
English
Packt Publishing
Content preview from Java Deep Learning Cookbook

There's more...

The ReLU activation function is non-linear and its derivative is trivial to evaluate, so the backpropagation of errors through it can be performed easily. Backpropagation is the backbone of neural networks: it is the learning algorithm that computes the gradients of the loss with respect to the weights across neurons, which gradient descent then uses to update those weights. The following are the ReLU variations currently supported in DL4J:

  • ReLU: The standard ReLU activation function:
public static final Activation RELU
  • ReLU6: ReLU activation, which is capped at 6, where 6 is an arbitrary choice:
public static final Activation RELU6
  • RReLU: The randomized ReLU activation function:
public static final Activation RRELU
  • ThresholdedReLU: Threshold ReLU:
public static final Activation THRESHOLDEDRELU
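
For quick reference, the variants above follow the standard definitions sketched below; the RReLU slope α and the ThresholdedReLU cutoff θ are parameters of the respective activations, and their DL4J default values are not shown here:

\[
\begin{aligned}
\mathrm{ReLU}(x) &= \max(0,\, x) \\
\mathrm{ReLU6}(x) &= \min\bigl(\max(0,\, x),\, 6\bigr) \\
\mathrm{RReLU}(x) &= \begin{cases} x, & x \ge 0 \\ \alpha x, & x < 0 \end{cases} \quad \text{where } \alpha \text{ is sampled at random during training} \\
\mathrm{ThresholdedReLU}(x) &= \begin{cases} x, & x > \theta \\ 0, & x \le \theta \end{cases}
\end{aligned}
\]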

There are a few more implementations, such ...
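
The Activation constants listed above are passed to a layer builder when a network is configured. The following is a minimal sketch of that usage, assuming a recent DL4J release on the classpath; the layer sizes, updater, and loss function are arbitrary placeholders rather than values from this recipe:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ReluActivationExample {
    public static void main(String[] args) {
        // Placeholder sizes: 784 inputs, 256 hidden units, 10 outputs.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))
                .list()
                .layer(new DenseLayer.Builder()
                        .nIn(784)
                        .nOut(256)
                        // Standard ReLU; Activation.RELU6, Activation.RRELU, or
                        // Activation.THRESHOLDEDRELU could be substituted here.
                        .activation(Activation.RELU)
                        .build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(256)
                        .nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork network = new MultiLayerNetwork(conf);
        network.init();
        System.out.println(network.summary());
    }
}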

