Fine-tuning learned embeddings from word2vec

In this example, we will use the same network as the one we used to learn our embeddings from scratch. In terms of code, the only major difference is an extra block of code to load the word2vec model and build up the weight matrix for the embedding layer.

As always, we start with the imports and set a random seed for repeatability. In addition to the imports we have seen previously, there is one new one: the KeyedVectors class from gensim, which is used to load the pretrained word2vec model:

from gensim.models import KeyedVectors
from keras.layers.core import Dense, Dropout, SpatialDropout1D
from keras.layers.convolutional import Conv1D
from keras.layers.embeddings import Embedding
from keras.layers.pooling import GlobalMaxPooling1D
from ...
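The extra block mentioned above maps each word in our vocabulary to its pretrained vector, producing a weight matrix that can initialize the Embedding layer. The sketch below illustrates the mechanism only: the word-to-index map and the tiny stand-in dictionary are hypothetical, and in practice `word2vec` would come from `KeyedVectors.load_word2vec_format(...)` on the downloaded word2vec binary.

```python
import numpy as np

EMBED_SIZE = 4  # tiny for illustration; pretrained word2vec vectors are 300-d

# Stand-in for the loaded word2vec model; in the chapter this would be
# the object returned by KeyedVectors.load_word2vec_format(...).
word2vec = {
    "movie": np.ones(EMBED_SIZE, dtype=np.float32),
    "film": np.full(EMBED_SIZE, 0.5, dtype=np.float32),
}

# Hypothetical word-to-index map from earlier vocabulary building.
word2index = {"movie": 1, "film": 2, "unseen": 3}
vocab_size = len(word2index) + 1  # row 0 reserved for padding

# Build the weight matrix for the Embedding layer: one row per word.
# Words missing from the word2vec vocabulary keep an all-zeros row.
embedding_weights = np.zeros((vocab_size, EMBED_SIZE))
for word, index in word2index.items():
    try:
        embedding_weights[index, :] = word2vec[word]
    except KeyError:
        pass  # out-of-vocabulary word: leave the zero row
```

The resulting matrix is passed to the Embedding layer (via its `weights` argument) so that training starts from the pretrained vectors rather than a random initialization, which is what makes this fine-tuning rather than learning from scratch.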
