
Hands-On Mathematics for Deep Learning

by Jay Dawani
June 2020
Intermediate to advanced
364 pages
13h 56m
English
Packt Publishing

Cross entropy

Cross-entropy loss is used mostly when we have a binary classification problem; that is, where each target label is either 1 or 0 and the network outputs a predicted probability between 0 and 1.

Suppose we are given a training dataset $\{(x^{(i)}, y^{(i)})\}_{i=1}^{N}$, where $y^{(i)} \in \{0, 1\}$. We can then write the network's prediction in the following form:

$$\hat{y}^{(i)} = P(y^{(i)} = 1 \mid x^{(i)}; \theta)$$

Here, θ denotes the parameters of the network (weights and biases). We can express the probability of a single label in terms of a Bernoulli distribution, as follows:

$$P(y^{(i)} \mid x^{(i)}; \theta) = (\hat{y}^{(i)})^{y^{(i)}} (1 - \hat{y}^{(i)})^{1 - y^{(i)}}$$

The probability, given the entire dataset (assuming the examples are independent), is then the product of the per-example probabilities:

$$P(y^{(1)}, \ldots, y^{(N)} \mid x^{(1)}, \ldots, x^{(N)}; \theta) = \prod_{i=1}^{N} (\hat{y}^{(i)})^{y^{(i)}} (1 - \hat{y}^{(i)})^{1 - y^{(i)}}$$

If we take the negative logarithm of this likelihood and average over the dataset, we obtain the cross-entropy loss:

$$L(\theta) = -\frac{1}{N} \sum_{i=1}^{N} \left[ y^{(i)} \log \hat{y}^{(i)} + (1 - y^{(i)}) \log \left(1 - \hat{y}^{(i)}\right) \right]$$

Minimizing this loss is therefore equivalent to maximizing the likelihood of the labels under the Bernoulli model.
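To make the derivation concrete, here is a minimal NumPy sketch (not from the book) that evaluates the loss formula above directly; the epsilon clipping is an added assumption to avoid taking log(0):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy: the negative mean log-likelihood of the
    labels under a Bernoulli model.

    y_true: array of labels in {0, 1}
    y_pred: array of predicted probabilities P(y = 1 | x; theta)
    eps:    clipping constant (an added assumption) to keep log() finite
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred)
                    + (1 - y_true) * np.log(1 - y_pred))

# Example with three training examples:
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7])
print(binary_cross_entropy(y_true, y_pred))  # ~0.2284
```

Note how confident correct predictions (0.9 for a true label of 1) contribute little to the loss, while a confident wrong prediction would contribute a large penalty, which is exactly the behavior the negative log-likelihood derivation predicts.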



Publisher Resources

ISBN: 9781838647292