Activation functions introduce nonlinearity into neural networks. We apply the activation function to the weighted sum of the input plus the bias; that is, we compute f(z), where z = (input * weights) + bias. The following are some common types of activation functions:
- Sigmoid function: The sigmoid function is one of the most commonly used activation functions. It squashes its input to a value between 0 and 1 and can be defined as f(z) = 1 / (1 + e^(-z)). When we apply this function to z, the values are scaled into the range of 0 to 1. The sigmoid is also called the logistic function, and it has an s-shaped curve, as shown in the following figure.
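The sigmoid can be sketched in a few lines of NumPy; the input values, weights, and bias below are made-up illustrative numbers, not from the text:

```python
import numpy as np

def sigmoid(z):
    """Squash z into the range (0, 1): f(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

# z = (input * weights) + bias, using hypothetical example values
x = np.array([0.5, -1.2])   # input
w = np.array([0.8, 0.3])    # weights
b = 0.1                     # bias
z = np.dot(x, w) + b

print(sigmoid(z))           # a value strictly between 0 and 1
```

Note that sigmoid(0) = 0.5, large positive inputs approach 1, and large negative inputs approach 0, which gives the function its s-shape.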