Experimenting with an LSTM network with an additional layer

In this second experiment to improve the performance of the classification model, we will add an extra LSTM layer. Let's have a look at the following code:

# Model architecture
model <- keras_model_sequential() %>%
        layer_embedding(input_dim = 500, output_dim = 32) %>%
        layer_lstm(units = 32,
                   return_sequences = TRUE) %>%
        layer_lstm(units = 32) %>%
        layer_dense(units = 1, activation = "sigmoid")

# Compiling model
model %>% compile(optimizer = "adam",
        loss = "binary_crossentropy",
        metrics = c("acc"))

# Fitting model
model_three <- model %>% fit(train_x, train_y,
        epochs = 10,
        batch_size = 128,
        validation_split = 0.2)

# Loss and accuracy plot
plot(model_three)
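Note that the first layer_lstm() call sets return_sequences = TRUE. With the default (FALSE), an LSTM layer returns only its final hidden state, which the second LSTM layer could not consume; returning the full sequence of hidden states gives the next layer one 32-unit vector per timestep. The following is a minimal sketch of that shape difference, not code from the original experiment (the input_length value of 100 here is an illustrative assumption):

library(keras)

# return_sequences = TRUE: the LSTM emits a hidden state at every
# timestep, so the output is 3D -- (batch, timesteps, units).
# This sequence can feed a second LSTM layer.
seq_model <- keras_model_sequential() %>%
        layer_embedding(input_dim = 500, output_dim = 32,
                        input_length = 100) %>%
        layer_lstm(units = 32, return_sequences = TRUE)
seq_model$output_shape  # 3D: one 32-unit vector per timestep

# Default return_sequences = FALSE: only the last hidden state is
# returned, so the output is 2D -- (batch, units). Suitable as the
# final LSTM before a dense classification head, but not for
# stacking another LSTM on top.
vec_model <- keras_model_sequential() %>%
        layer_embedding(input_dim = 500, output_dim = 32,
                        input_length = 100) %>%
        layer_lstm(units = 32)
vec_model$output_shape  # 2D: a single 32-unit vector per sample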

By adding an extra LSTM layer ...
