Here, we instantiate a Sequential model and add the following layers:
- LSTM
- Dense
The following steps describe the preceding points in detail:
- First, we add a stateful LSTM layer followed by a dense layer with a single output, as follows (a brief training sketch follows this list):

model = Sequential()
# Stateful LSTM layer; batch_input_shape fixes the batch size, which stateful=True requires
model.add(LSTM(n_neurons, batch_input_shape=(n_batch, X.shape[1], X.shape[2]), stateful=True))
# Dense layer with a single output for the predicted value
model.add(Dense(1))
- Then, we compile the model using model.compile(), specifying the loss function and the optimizer, as follows:
model.compile(loss='mean_squared_error', optimizer='adam')
- We are using MSE as the loss function and Adam as the optimizer. MSE is the sum of the squared differences between the predicted values and the actual values, divided by n, where n is the total number of samples.
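As a rough illustration of this definition, the following snippet computes MSE by hand with NumPy (a minimal sketch; the y_true and y_pred arrays are made-up values for demonstration only):

import numpy as np

# Hypothetical actual and predicted values, for illustration only
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])

# MSE: sum of squared differences divided by the number of samples n
mse = np.sum((y_pred - y_true) ** 2) / len(y_true)
print(mse)  # 0.025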
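Once compiled, the stateful model is typically trained one epoch at a time with shuffling disabled, resetting the internal state between epochs so that state does not carry over from one pass to the next. The following is a minimal sketch, assuming a target array y aligned with X and an illustrative n_epochs value:

n_epochs = 10  # assumed number of epochs, for illustration
for i in range(n_epochs):
    # shuffle=False preserves the sequence order that the stateful LSTM relies on
    model.fit(X, y, epochs=1, batch_size=n_batch, shuffle=False, verbose=0)
    # Clear the LSTM's internal state before the next pass over the data
    model.reset_states()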