In this example, we will use the same network we used earlier to learn our embeddings from scratch. In terms of code, the only major difference is an extra block of code that loads the word2vec model and builds the weight matrix for the embedding layer.
As always, we start with the imports and set up a random seed for repeatability. Besides the imports we have seen previously, there is one new import: the KeyedVectors class from gensim, which we will use to load the pre-trained word2vec model:
from gensim.models import KeyedVectors
from keras.layers.core import Dense, Dropout, SpatialDropout1D
from keras.layers.convolutional import Conv1D
from keras.layers.embeddings import Embedding
from keras.layers.pooling import GlobalMaxPooling1D
from ...
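Since the listing above is cut off, the following is a minimal sketch of the extra block described earlier: loading the pre-trained word2vec vectors with KeyedVectors and building the weight matrix for the embedding layer. The file name word2vec_bin, the word2index mapping, and the constants vocab_size, EMBED_SIZE, and maxlen are illustrative assumptions, not names taken from the original listing.

import numpy as np
from gensim.models import KeyedVectors
from keras.layers.embeddings import Embedding

# Assumed names and values for this sketch only.
word2vec_bin = "GoogleNews-vectors-negative300.bin.gz"  # path to binary vectors
EMBED_SIZE = 300      # dimensionality of the GoogleNews vectors
vocab_size = 5000     # assumed vocabulary size for the corpus
maxlen = 40           # assumed maximum sentence length
word2index = {"movie": 1, "film": 2}  # placeholder word -> index mapping

# Load the pre-trained word2vec model from its binary format.
word2vec = KeyedVectors.load_word2vec_format(word2vec_bin, binary=True)

# Each row of the weight matrix corresponds to a vocabulary index;
# words missing from the word2vec vocabulary keep all-zero vectors.
embedding_weights = np.zeros((vocab_size, EMBED_SIZE))
for word, index in word2index.items():
    try:
        embedding_weights[index] = word2vec[word]
    except KeyError:
        pass  # out-of-vocabulary word, leave its row as zeros

# Hand the matrix to the embedding layer as its initial weights; the
# network can then fine-tune the pre-trained vectors during training.
embedding = Embedding(vocab_size, EMBED_SIZE, input_length=maxlen,
                      weights=[embedding_weights])

This is the pattern the section refers to: rather than learning the embedding weights from random initialization, we seed the Embedding layer with vectors already learned by word2vec, and words absent from the pre-trained vocabulary simply start from zero vectors.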