Choosing the most appropriate activation function

Using Keras, you can choose from a number of different activation functions. Some of these have been discussed in previous chapters, while others have not yet been covered. We can begin by listing the ones we have already covered, with a quick note on each function:

  • Linear: Also known as the identity function. Returns the value of x unchanged.
  • Sigmoid: Uses 1 divided by 1 plus the exponent of negative x.
  • Hyperbolic tangent (tanh): Uses the exponent of x minus the exponent of negative x, divided by the exponent of x plus the exponent of negative x. This has the same shape as the sigmoid function; however, the range along the y-axis goes from -1 to 1 instead of from 0 to 1.
  • Rectified Linear Units (ReLU): Uses the maximum of 0 and x, so negative inputs become 0 and positive inputs pass through unchanged.
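To make the verbal descriptions above concrete, here is a minimal plain-math sketch of each function. The book itself works with keras in R; this illustration is written in standalone Python so the formulas can be read without any framework.

```python
import math

def linear(x):
    # Identity function: returns x unchanged.
    return x

def sigmoid(x):
    # 1 / (1 + e^(-x)); squashes output into the range (0, 1).
    return 1 / (1 + math.exp(-x))

def tanh(x):
    # (e^x - e^(-x)) / (e^x + e^(-x)); squashes output into (-1, 1).
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def relu(x):
    # max(0, x): zeroes out negative inputs, passes positives through.
    return max(0.0, x)

print(sigmoid(0))   # 0.5
print(tanh(0))      # 0.0
print(relu(-3))     # 0.0
```

Note how sigmoid and tanh both pass through a single "center" value at x = 0 (0.5 and 0 respectively), which reflects the shared S-shape mentioned above.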
