How to do it...

We proceed with the recipe as follows:

  1. We import the necessary modules:
import tensorflow as tf
import numpy as np
from tensorflow.examples.tutorials.mnist import input_data
import matplotlib.pyplot as plt
%matplotlib inline
  2. Load the MNIST dataset from the TensorFlow examples:
mnist = input_data.read_data_sets("MNIST_data/")
trX, trY, teX, teY = mnist.train.images, mnist.train.labels, mnist.test.images, mnist.test.labels
  3. Define the SparseAutoEncoder class, which is very similar to the autoencoder class in the previous recipe, except that it introduces the KL divergence loss (a sketch of how this sparsity penalty enters the total loss follows these steps):
def kl_div(self, rho, rho_hat):
    term2_num = tf.constant(1.) - rho
    term2_den = tf.constant(1.) - rho_hat
    kl = self.logfunc(rho, rho_hat) + self.logfunc(term2_num, term2_den)
    return kl
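To make the sparsity term concrete, here is a minimal, self-contained sketch of how the KL divergence penalty can be combined with the reconstruction loss. The helper logfunc, the sparsity target rho, and the weight beta are assumptions for illustration and are not taken verbatim from the full class in the recipe:

# Minimal sketch (assumed names): KL-divergence sparsity penalty for a sparse autoencoder.
# rho is the desired average activation; rho_hat is the observed mean activation per hidden unit.
import tensorflow as tf

def logfunc(x, x_hat):
    # Elementwise x * log(x / x_hat); the small constants guard against log(0) and division by zero.
    return tf.multiply(x, tf.log(tf.div(x, x_hat + 1e-10) + 1e-10))

def kl_div(rho, rho_hat):
    # KL(rho || rho_hat) applied elementwise to the mean hidden activations.
    term2_num = tf.constant(1.) - rho
    term2_den = tf.constant(1.) - rho_hat
    return logfunc(rho, rho_hat) + logfunc(term2_num, term2_den)

# Hypothetical usage inside the loss: reconstruction error plus a weighted sparsity penalty.
# h is the hidden-layer activation (batch_size x n_hidden); X and X_hat are the input and its reconstruction.
# rho_hat = tf.reduce_mean(h, axis=0)
# sparsity_loss = tf.reduce_sum(kl_div(0.02, rho_hat))                      # rho = 0.02 is an assumed target
# total_loss = tf.reduce_mean(tf.square(X - X_hat)) + 1.0 * sparsity_loss   # beta = 1.0 is an assumed weight

The sparsity penalty pushes the average activation of each hidden unit toward the small target rho, so only a few units fire strongly for any given input, while the reconstruction term keeps the encoding informative.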
