Hands-On Convolutional Neural Networks with TensorFlow

by Iffat Zafar, Giounona Tzanidou, Richard Burton, Nimesh Patel, Leonardo Araujo
Packt Publishing, August 2018

Kullback-Leibler divergence

The Kullback-Leibler (KL) divergence loss produces a single number that indicates how close two probability distributions are to each other.
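Concretely, for a target distribution $P$ and an approximating distribution $Q$ over the same discrete outcomes, the divergence is

$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},$$

which is always non-negative and equals zero exactly when $P$ and $Q$ agree everywhere.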

The closer the two distributions get to each other, the lower the loss becomes. In the following graph, the blue distribution is trying to model the green distribution; as the blue distribution approaches the green one, the KL divergence loss gets closer to zero.
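As a minimal sketch of the computation (assuming TensorFlow 2's eager API rather than any particular edition of the book's code, and with made-up example distributions), the divergence can be evaluated directly:

import tensorflow as tf

# Target ("green") distribution and the model's ("blue") approximation,
# both over four discrete outcomes. These example values are illustrative.
p = tf.constant([0.1, 0.4, 0.4, 0.1])
q = tf.constant([0.25, 0.25, 0.25, 0.25])

# D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x))
kl = tf.reduce_sum(p * tf.math.log(p / q))
print(float(kl))  # positive, since the distributions differ (~0.19)

# If the model matches the target exactly, the divergence is zero.
print(float(tf.reduce_sum(p * tf.math.log(p / p))))  # 0.0

TensorFlow also ships a built-in tf.keras.losses.KLDivergence loss, which computes the same per-sample quantity and can be plugged into model training directly.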
