Keras provides an LSTM layer that we will use here to construct and train a many-to-one RNN. Our network takes in a sentence (a sequence of words) and outputs a sentiment value (positive or negative). Our training set is a dataset of about 7,000 short sentences from the UMICH SI650 sentiment classification competition on Kaggle (https://inclass.kaggle.com/c/si650winter11). Each sentence is labeled 1 for positive or 0 for negative sentiment, which our network will learn to predict.
We start with the imports, as usual:
from keras.layers.core import Activation, Dense, Dropout, SpatialDropout1D
from keras.layers.embeddings import Embedding
from keras.layers.recurrent import LSTM
from keras.models import Sequential
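Before walking through the data preparation, it may help to see the shape of the network these imports build toward. The sketch below is a minimal, illustrative version of the many-to-one architecture described above: an Embedding layer turns word indices into dense vectors, an LSTM consumes the sequence and emits a single final state, and a sigmoid Dense layer maps that state to a 0/1 sentiment score. The vocabulary size, embedding dimension, and LSTM width are placeholder values, not ones derived from the UMICH dataset, and the imports use the flat tf.keras layout rather than the older module paths shown above:

```python
import numpy as np
from tensorflow.keras.layers import Dense, Embedding, LSTM, SpatialDropout1D
from tensorflow.keras.models import Sequential

VOCAB_SIZE = 2000   # assumed vocabulary size (placeholder)
EMBED_SIZE = 128    # assumed embedding dimension (placeholder)
MAX_LEN = 40        # assumed padded sentence length (placeholder)

model = Sequential()
# Map each word index to a dense EMBED_SIZE-dimensional vector
model.add(Embedding(VOCAB_SIZE, EMBED_SIZE))
# Drop entire embedding channels at random to regularize
model.add(SpatialDropout1D(0.2))
# Many-to-one: the LSTM returns only its final hidden state
model.add(LSTM(64, dropout=0.2, recurrent_dropout=0.2))
# Single sigmoid unit: output near 1 = positive, near 0 = negative
model.add(Dense(1, activation="sigmoid"))
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])

# Sanity check on a batch of two random (untrained) padded sentences
dummy = np.random.randint(0, VOCAB_SIZE, size=(2, MAX_LEN))
preds = model.predict(dummy, verbose=0)
print(preds.shape)  # one sentiment score per sentence
```

Because the model is untrained, the predictions here are meaningless; the point is only that a batch of padded word-index sequences goes in and one probability per sentence comes out.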