Hands-On Machine Learning for Algorithmic Trading

by Stefan Jansen
December 2018
Beginner to intermediate
684 pages
21h 9m
English
Packt Publishing
Content preview from Hands-On Machine Learning for Algorithmic Trading

Unfolding a computational graph with cycles

As suggested in the previous section, RNNs are called recurrent because they apply the same transformation to every element of a sequence, with each output depending on the outcome of prior iterations. As a result, an RNN maintains an internal state that captures information about previous elements in the sequence, acting like memory.

The following diagram shows the computational graph implied by a simple hidden RNN unit, which learns two weight matrices during training:

  • W_hh: Applies to the previous hidden state, h_{t-1}
  • W_hx: Applies to the current input, x_t

A non-linear transformation of the sum of the two matrix multiplications—for example, using the tanh or ReLU activation functions—becomes ...
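The single-step computation described above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the book: the dimensions, random initialization, and the tanh choice are assumptions made here to show how the same W_hh and W_hx are reused at every step of the unrolled sequence.

```python
import numpy as np

# Illustrative dimensions and initialization (not from the book's code).
rng = np.random.default_rng(42)
n_inputs, n_hidden = 3, 4

W_hx = rng.normal(scale=0.1, size=(n_hidden, n_inputs))  # applies to input x_t
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # applies to state h_{t-1}
b = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    """Return the new hidden state: tanh of the sum of both matrix products."""
    return np.tanh(W_hx @ x_t + W_hh @ h_prev + b)

# Unrolling the cyclic graph: the SAME weights are applied at every time step,
# and the hidden state carries information forward through the sequence.
h = np.zeros(n_hidden)
for x_t in rng.normal(size=(5, n_inputs)):
    h = rnn_step(x_t, h)
```

Because tanh squashes its input into (-1, 1), every component of the hidden state stays bounded regardless of sequence length, which is one reason it is a common choice for the recurrent non-linearity.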


Publisher Resources

ISBN: 9781789346411