Picking good activation functions makes training much easier. The choice of activation function also carries at least two practical consequences: how you should transform (scale) your input data, and how binary targets (if there are any) should be encoded. There are infinitely many possible activation functions; in principle, any continuous function will do, so you can even define your own.
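As a minimal sketch of those two consequences, the snippet below scales inputs and encodes a binary target to match the range of the output activation. It assumes NumPy and a sigmoid or tanh output layer; the array names are illustrative only and do not come from the text above.

```python
import numpy as np

# Hypothetical raw features and binary labels (illustrative values only).
X_raw = np.array([[150.0, 0.2], [80.0, 0.9], [120.0, 0.5]])
y_raw = np.array(["yes", "no", "yes"])

# Scale inputs to [-1, 1] so they fall in the responsive range of tanh/sigmoid.
X = 2 * (X_raw - X_raw.min(axis=0)) / (X_raw.max(axis=0) - X_raw.min(axis=0)) - 1

# Encode the binary target to match the output activation's range:
#   sigmoid outputs in (0, 1)  -> use 0/1 labels
#   tanh    outputs in (-1, 1) -> use -1/+1 labels
y_sigmoid = np.where(y_raw == "yes", 1.0, 0.0)
y_tanh = np.where(y_raw == "yes", 1.0, -1.0)
```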
Here is a list of popular activation functions:
- Rectified Linear Unit (ReLU):