Hyperparameter tuning

In this experiment, we will vary the number of units in the dense layer, the dropout rate, and the batch size to find values that improve classification performance. This also illustrates an efficient way of arriving at suitable hyperparameter values through experimentation. We will start by creating a TransferLearning.R file containing the following code:

# Model with RESNET50
pretrained <- application_resnet50(weights = 'imagenet',
                                   include_top = FALSE,
                                   input_shape = c(224, 224, 3))

# Flags for hyperparameter tuning
FLAGS <- flags(flag_integer("dense_units", 256),
               flag_numeric("dropout", 0.1),
               flag_integer("batch_size", 10))

# Model architecture
model <- keras_model_sequential() %>%
  pretrained %>%
  layer_flatten() %>%
  layer_dense(units ...
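The snippet above is truncated after `layer_dense(units`. The following is a minimal sketch of how such a script is typically completed with the `keras` and `tfruns` packages; everything after the flatten layer (the dropout layer, the two-class output, the compile and fit settings, and the `trainx`/`trainy` data objects) is an assumption based on the flags defined above, not the book's exact code:

```r
# Sketch only: layers and settings past layer_flatten() are assumptions.
library(keras)
library(tfruns)

pretrained <- application_resnet50(weights = 'imagenet',
                                   include_top = FALSE,
                                   input_shape = c(224, 224, 3))

FLAGS <- flags(flag_integer("dense_units", 256),
               flag_numeric("dropout", 0.1),
               flag_integer("batch_size", 10))

model <- keras_model_sequential() %>%
  pretrained %>%
  layer_flatten() %>%
  # The flags feed directly into the architecture and training loop
  layer_dense(units = FLAGS$dense_units, activation = 'relu') %>%
  layer_dropout(rate = FLAGS$dropout) %>%
  layer_dense(units = 2, activation = 'softmax')  # assumed two classes

# Keep the pretrained convolutional base fixed; train only the new layers
freeze_weights(pretrained)

model %>% compile(loss = 'categorical_crossentropy',
                  optimizer = optimizer_adam(),
                  metrics = 'accuracy')

# trainx and trainy are placeholders for the training data
model %>% fit(trainx, trainy,
              epochs = 10,
              batch_size = FLAGS$batch_size,
              validation_split = 0.2)
```

With the script in place, `tuning_run()` from tfruns can launch one training run per flag combination (the candidate values here are illustrative):

```r
runs <- tuning_run("TransferLearning.R",
                   flags = list(dense_units = c(256, 512),
                                dropout = c(0.1, 0.3),
                                batch_size = c(10, 32)))
```

Each run's flags and metrics are recorded, so the best-performing combination can be read off the resulting `runs` data frame.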
