Visualizing latent spaces with t-SNE
We now have an autoencoder that takes in a credit card transaction and outputs a credit card transaction that looks more or less the same. Reconstruction, however, is not why we built the autoencoder. The main advantage of an autoencoder is that we can now encode each transaction into a lower-dimensional representation that captures its main characteristics.
To create the encoder model, all we have to do is define a new Keras model that maps from the input to the encoded state:
encoder = Model(data_in, encoded)
Note that you don't need to train this model again. The layers keep the weights from the previously trained autoencoder.
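If you don't have the earlier listing at hand, the sketch below shows roughly how data_in and encoded could have been defined before extracting the encoder. The layer sizes and activations here are illustrative assumptions, not necessarily the exact architecture used earlier:

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# The credit card transactions are assumed to have 29 features each
data_in = Input(shape=(29,))
encoded = Dense(12, activation='tanh')(data_in)      # bottleneck / latent representation
decoded = Dense(29, activation='sigmoid')(encoded)   # reconstruction of the input

autoencoder = Model(data_in, decoded)
autoencoder.compile(optimizer='adam', loss='mean_squared_error')
# autoencoder.fit(X_train, X_train, ...)  # trained before we extract the encoder

# The encoder reuses the already trained layers, so it needs no further training
encoder = Model(data_in, encoded)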
To encode our data, we now use the encoder model:
enc = encoder.predict(X_test)
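To connect this with the section title, here is one way the encoded transactions could be fed into t-SNE for a two-dimensional plot. The scikit-learn and Matplotlib calls, the subsample size, and the use of y_test as the fraud labels are assumptions layered on top of the code shown so far:

from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

# t-SNE scales poorly with the number of samples, so work on a subsample
sample = enc[:5000]
labels = y_test[:5000]   # assumed to hold the fraud labels for X_test

tsne = TSNE(n_components=2, verbose=1)
enc_2d = tsne.fit_transform(sample)

plt.scatter(enc_2d[:, 0], enc_2d[:, 1], c=labels, s=2, cmap='coolwarm')
plt.title('t-SNE of encoded transactions')
plt.show()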