
Hands-On Unsupervised Learning with Python

by Giuseppe Bonaccorso
February 2019
Intermediate to advanced
386 pages
9h 54m
English
Packt Publishing
Content preview from Hands-On Unsupervised Learning with Python

Adding a sparseness constraint to the deep convolutional autoencoder

In this example, we want to increase the sparsity of the code by adding an L1 penalty. The DAG and the training process are exactly the same as in the main example; the only difference is the loss function, which now becomes the following:

...
sparsity_constraint = 0.01 * tf.reduce_sum(tf.norm(code_layer, ord=1, axis=1))
loss = tf.nn.l2_loss(convt_3 - input_images) + sparsity_constraint
...
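For reference, the following is a minimal, self-contained sketch of how such a penalty can be attached to a convolutional autoencoder's loss, written in a TensorFlow 2/Keras style rather than the book's original graph. The encoder/decoder architectures, layer sizes, and names (encoder, decoder, sparse_loss, alpha) are illustrative assumptions; only the structure of the loss mirrors the snippet above.

import tensorflow as tf

alpha = 0.01  # sparsity coefficient (the α discussed below)

# Illustrative encoder/decoder; shapes and layer counts are assumptions,
# not the DAG of the main example
encoder = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, strides=2, padding='same', activation='relu',
                           input_shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, strides=2, padding='same', activation='relu'),
])
decoder = tf.keras.Sequential([
    tf.keras.layers.Conv2DTranspose(16, 3, strides=2, padding='same',
                                    activation='relu'),
    tf.keras.layers.Conv2DTranspose(1, 3, strides=2, padding='same',
                                    activation='sigmoid'),
])

def sparse_loss(input_images):
    code_layer = encoder(input_images)
    reconstruction = decoder(code_layer)
    # Sum of the L1 norms of the code activations (the sparsity constraint)
    sparsity_constraint = alpha * tf.reduce_sum(tf.abs(code_layer))
    # Sum-of-squared-errors reconstruction term (what tf.nn.l2_loss computes)
    return tf.nn.l2_loss(reconstruction - input_images) + sparsity_constraint

Because the L1 term grows with the absolute value of every code activation, it pushes individual activations toward zero, which is what yields a sparser code.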

We have added a sparsity constraint with α = 0.01; we can therefore retrain the model and check the average code length (the mean of the code-layer activations). The output of the process is as follows:

Epoch 1) Average loss per sample: 12.785746307373048 (Code mean: 0.30300647020339966)
Epoch 2) Average loss per sample: 10.576686706542969 ...
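The logging code itself is not shown in this excerpt; the following hypothetical training-loop fragment, building on the sketch above, illustrates how figures of this kind (average loss per sample and mean code activation) can be produced. The names train_dataset, optimizer, and the number of epochs are assumptions, not the book's variables.

optimizer = tf.keras.optimizers.Adam(1e-3)

for epoch in range(1, 11):
    total_loss = 0.0
    total_code_mean = 0.0
    n_batches = 0
    for images in train_dataset:  # assumed tf.data.Dataset yielding (batch, 28, 28, 1)
        with tf.GradientTape() as tape:
            code_layer = encoder(images)
            reconstruction = decoder(code_layer)
            loss = (tf.nn.l2_loss(reconstruction - images) +
                    alpha * tf.reduce_sum(tf.abs(code_layer)))
        variables = encoder.trainable_variables + decoder.trainable_variables
        gradients = tape.gradient(loss, variables)
        optimizer.apply_gradients(zip(gradients, variables))
        # Accumulate per-sample loss and the mean code activation for the epoch
        total_loss += float(loss) / images.shape[0]
        total_code_mean += float(tf.reduce_mean(code_layer))
        n_batches += 1
    print('Epoch {}) Average loss per sample: {} (Code mean: {})'.format(
        epoch, total_loss / n_batches, total_code_mean / n_batches))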


Publisher Resources

ISBN: 9781789348279
Supplemental Content