Convolutional and subsampling operations in DL4J

In this subsection, we will walk through an example of constructing a CNN for MNIST classification. The network has two convolutional layers, two subsampling layers, one dense layer, and a fully connected output layer. The layers are stacked in the following order: a convolutional layer, a subsampling layer, a second convolutional layer, a second subsampling layer, a dense layer, and finally the output layer.

Let's see what these layers look like in DL4J. Here is the first convolutional layer, with ReLU as the activation function:

import org.deeplearning4j.nn.conf.layers.ConvolutionLayer

// First convolutional layer: 5x5 kernel, stride 1, 20 output feature maps
val layer_0 = new ConvolutionLayer.Builder(5, 5)
    .nIn(nChannels)
    .stride(1, 1)
    .nOut(20)
    .activation("relu")
    .build()
...
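
The remaining layers follow the same builder pattern. As a rough sketch only (the pooling windows, the 50 filters, the 500 dense units, the seed, and the loss function below are illustrative assumptions, not values taken from this text), the rest of the network described above could be assembled into a MultiLayerConfiguration along these lines:

import org.deeplearning4j.nn.conf.NeuralNetConfiguration
import org.deeplearning4j.nn.conf.inputs.InputType
import org.deeplearning4j.nn.conf.layers.{DenseLayer, OutputLayer, SubsamplingLayer}
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork
import org.nd4j.linalg.lossfunctions.LossFunctions

val outputNum = 10  // ten digit classes in MNIST

// First subsampling (max-pooling) layer: 2x2 window, stride 2 (assumed values)
val layer_1 = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
    .kernelSize(2, 2)
    .stride(2, 2)
    .build()

// Second convolutional layer: 5x5 kernel, 50 feature maps (assumed values)
val layer_2 = new ConvolutionLayer.Builder(5, 5)
    .stride(1, 1)
    .nOut(50)
    .activation("relu")
    .build()

// Second subsampling (max-pooling) layer
val layer_3 = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
    .kernelSize(2, 2)
    .stride(2, 2)
    .build()

// Dense (fully connected) hidden layer with 500 units (assumed width)
val layer_4 = new DenseLayer.Builder()
    .activation("relu")
    .nOut(500)
    .build()

// Output layer: softmax over the ten classes, negative log-likelihood loss
val layer_5 = new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
    .nOut(outputNum)
    .activation("softmax")
    .build()

// Stack the layers; setInputType lets DL4J infer nIn for the later layers
val conf = new NeuralNetConfiguration.Builder()
    .seed(12345)
    .list()
    .layer(0, layer_0)
    .layer(1, layer_1)
    .layer(2, layer_2)
    .layer(3, layer_3)
    .layer(4, layer_4)
    .layer(5, layer_5)
    .setInputType(InputType.convolutionalFlat(28, 28, nChannels))
    .build()

val model = new MultiLayerNetwork(conf)
model.init()

Note that only the first layer sets nIn explicitly; with setInputType, DL4J computes the input sizes of the downstream layers automatically. Depending on the DL4J version, activations may have to be passed as Activation.RELU and Activation.SOFTMAX enum values rather than the string form used here.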
