Hands-On Reinforcement Learning with Python

by Sudharsan Ravichandiran
June 2018
Intermediate to advanced
318 pages
9h 24m
English
Packt Publishing
Content preview from Hands-On Reinforcement Learning with Python

Activation functions

Activation functions are used to introduce nonlinearity into neural networks. We apply the activation function to the weighted sum of the inputs plus the bias, that is, f(z), where z = (input * weights) + bias. There are different types of activation functions, as follows:

  • Sigmoid function: The sigmoid function is one of the most commonly used activation functions. It scales the value between 0 and 1. The sigmoid function can be defined as f(z) = 1 / (1 + e^(-z)). When we apply this function to z, the values will be scaled in the range of 0 to 1. This is also called a logistic function. It is S-shaped, as shown in the following ...
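The computation described above, z = (input * weights) + bias followed by a sigmoid activation, can be sketched in a few lines of NumPy. This is an illustrative example, not code from the book; the input, weight, and bias values are made up:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real-valued input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Toy values (illustrative only, not from the book)
inputs = np.array([0.5, -1.2, 3.0])
weights = np.array([0.4, 0.3, -0.2])
bias = 0.1

# z = (input * weights) + bias
z = np.dot(inputs, weights) + bias

# Apply the activation to get the neuron's output in (0, 1)
output = sigmoid(z)
```

Note that sigmoid(0) is exactly 0.5, and the output approaches 1 for large positive z and 0 for large negative z, which gives the S-shaped curve the text describes.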


Publisher Resources

ISBN: 9781788836524