
Multi-layer perceptron

Like pancakes, neural networks are made to be stacked on top of each other. We can feed the output of one layer as the input of the next layer, called a hidden layer. This hidden layer computes a linear combination of its inputs and applies an activation function to it. This produces a new hidden vector, which we can take as the input for the following hidden layer; at each step, the outputs of the previous layer are recombined with a new set of weights and passed through an activation function.
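
To make this concrete, here is a minimal forward-pass sketch in base R for one hidden layer followed by an output layer. The weights, biases, and the use of tanh as the activation are illustrative choices, not taken from the book's implementation:

# Minimal forward-pass sketch: input -> hidden layer -> output
# (illustrative weights and tanh activation; not the book's implementation)
set.seed(42)
x <- c(0.5, -1.2, 3.0)                 # input vector (3 features)

W1 <- matrix(rnorm(4 * 3), nrow = 4)   # hidden layer weights (4 units)
b1 <- rnorm(4)                         # hidden layer biases
h  <- tanh(W1 %*% x + b1)              # linear combination + activation = hidden vector

W2 <- matrix(rnorm(1 * 4), nrow = 1)   # output layer weights (1 unit)
b2 <- rnorm(1)                         # output layer bias
y  <- tanh(W2 %*% h + b2)              # recombine the hidden vector and activate again
y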

Let's start by introducing the sigmoid function, which will be useful later:

library(R6)

sigmoid <- function(x) {
  1 / (1 + exp(-x))
}
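
As a quick sanity check (our own example, not from the book), the sigmoid squashes any real input into the interval (0, 1) and maps 0 to 0.5:

sigmoid(c(-10, -1, 0, 1, 10))
# [1] 4.539787e-05 2.689414e-01 5.000000e-01 7.310586e-01 9.999546e-01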

The skeleton of the class is now:

MLP <- R6Class("MLP",
               public = list(
                 dim = NULL,
                 n_iter = NULL,
                 learning_rate = NULL,
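
The skeleton above is cut off at this point. Purely as an illustration of how such an R6 class is typically fleshed out (an assumption, not the book's code), a constructor that stores those fields might look like this:

# Illustrative continuation of the skeleton (assumed, not the book's code):
# an initialize() method that stores the hyperparameters in the public fields.
MLP_sketch <- R6Class("MLP_sketch",
  public = list(
    dim = NULL,
    n_iter = NULL,
    learning_rate = NULL,
    initialize = function(dim, n_iter = 100, learning_rate = 1e-2) {
      self$dim <- dim
      self$n_iter <- n_iter
      self$learning_rate <- learning_rate
    }
  )
)

# Usage sketch:
# mlp <- MLP_sketch$new(dim = c(3, 4, 1), n_iter = 500, learning_rate = 0.05)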
