The addition of regularization is fairly straightforward. We can apply an L1 penalty to the dense encoder layer using Keras' activity_regularizer argument, as follows:
encoding_l1 = Dense(units=encoding_size,
                    activation='relu',
                    activity_regularizer=regularizers.l1(10e-5),
                    name='Encoder_L1')(input_)
The input and decoding layers remain unchanged. In this example, with a compression factor of 24.5, regularization degrades performance, yielding a test RMSE of 0.2946.
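For context, a minimal end-to-end sketch of such a sparse autoencoder is shown below. Only the encoder line appears in this excerpt, so the import paths, input dimension, bottleneck size, decoder layer, and training configuration are assumptions; the 784-to-32 shape is chosen purely because it reproduces the stated compression factor of 24.5.

# Minimal sketch of a sparse autoencoder with L1 activity regularization.
# Dimensions and the decoder layer are assumptions for illustration only.
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from tensorflow.keras import regularizers

input_dim = 784       # assumed input size (gives 784 / 32 = 24.5 compression)
encoding_size = 32    # assumed bottleneck size

input_ = Input(shape=(input_dim,), name='Input')

# Encoder: the L1 activity regularizer penalizes large activations,
# encouraging a sparse encoding
encoding_l1 = Dense(units=encoding_size,
                    activation='relu',
                    activity_regularizer=regularizers.l1(10e-5),
                    name='Encoder_L1')(input_)

# Decoder: reconstructs the input from the sparse encoding (assumed layer)
decoded = Dense(units=input_dim, activation='sigmoid',
                name='Decoder')(encoding_l1)

autoencoder = Model(inputs=input_, outputs=decoded)
autoencoder.compile(optimizer='adam', loss='mse')

Because the penalty is applied to the layer's activations rather than its weights, it pushes the encoding itself toward sparsity, which is the intended effect here even though, in this particular example, it comes at a small cost in reconstruction error.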