Hands-On Machine Learning for Algorithmic Trading

by Stefan Jansen
December 2018
Beginner to intermediate
684 pages
21h 9m
English
Packt Publishing

Hidden units

Hidden units are unique to the design of neural networks, and several non-linear activation functions have been used successfully. The design of hidden activation functions remains an active area of research because it has a critical impact on the training process.

A very popular class of activation functions is the piece-wise linear units, such as the Rectified Linear Unit (ReLU). The activation is computed as g(z) = max(0, z), where z is the unit's pre-activation input; the functional form resembles the payoff of a call option. As a result, the derivative is constant (equal to one) whenever the unit is active. ReLUs are usually combined with an affine transformation of the inputs. They are often used instead of sigmoid units, and their discovery has greatly ...
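A minimal sketch of this idea in NumPy (illustrative weights and inputs, not from the book): an affine transformation of the inputs followed by the ReLU activation g(z) = max(0, z).

```python
import numpy as np

def relu_layer(x, W, b):
    """Affine transformation followed by a ReLU activation.

    z = W @ x + b is the pre-activation; g(z) = max(0, z) keeps only
    the positive part, so the derivative is 1 wherever the unit is
    active and 0 where it is not.
    """
    z = W @ x + b          # affine transformation of the inputs
    return np.maximum(0.0, z)

# Illustrative (made-up) parameters and input
W = np.array([[1.0, -2.0],
              [0.5,  1.0]])
b = np.array([0.0, -1.0])
x = np.array([1.0, 1.0])

print(relu_layer(x, W, b))  # units with negative pre-activation output 0
```

Note how the first unit's pre-activation is negative, so it is clipped to zero, while the second unit passes its positive pre-activation through unchanged, mirroring the call-option payoff max(0, z).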


ISBN: 9781789346411