Activation functions

Activation functions are used to introduce nonlinearity into neural networks. We apply the activation function to the weighted sum of the inputs plus the bias, that is, f(z), where z = (input * weights) + bias. There are different types of activation functions, as follows:

  • Sigmoid function: The sigmoid function is one of the most commonly used activation functions. It scales its input to a value between 0 and 1 and can be defined as f(z) = 1 / (1 + e^(-z)). When we apply this function to z, the values are squashed into the range 0 to 1. It is also called the logistic function, and its curve is S-shaped (a minimal code sketch follows this list).  ...
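The following is a minimal sketch of the idea described above: computing z as the weighted sum of the inputs plus the bias and passing it through a sigmoid activation. It assumes NumPy and uses made-up input, weight, and bias values purely for illustration; it is not code from the book.

import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) activation: squashes z into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Toy values (assumed for illustration only)
inputs = np.array([0.5, -1.2, 0.3])
weights = np.array([0.4, 0.7, -0.2])
bias = 0.1

z = np.dot(inputs, weights) + bias   # z = (input * weights) + bias
activation = sigmoid(z)              # f(z), a value between 0 and 1
print(activation)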
