Appendix B. Neural Network Classes

Building on the foundations from Appendix A, this appendix provides simple, class-based implementations of neural networks that mimic the APIs of packages such as scikit-learn. The implementations rely on plain, simple Python code and are meant for illustration and instruction. The classes presented in this appendix cannot replace the robust, efficient, and scalable implementations found in standard Python packages such as scikit-learn, or TensorFlow in combination with Keras.

The appendix comprises several sections, beginning with the implementation of activation functions.

The implementations and examples in this appendix are simple and straightforward. The Python classes are not well suited to tackling larger estimation or classification problems; the idea is rather to show easy-to-understand Python implementations built from scratch.
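To make the scikit-learn analogy concrete: the classes follow the familiar estimator pattern in which a fit() method estimates parameters from data and a predict() method applies them to new inputs. The following minimal sketch illustrates that pattern only; the class MeanEstimator and its attribute mean_ are hypothetical and do not appear in the appendix:

import numpy as np

class MeanEstimator:
    ''' Hypothetical minimal estimator illustrating the
        scikit-learn-style fit/predict API (not from the book). '''
    def fit(self, X, y):
        self.mean_ = np.mean(y)  # "learn" the average of the target values
        return self  # returning self allows chaining, as in scikit-learn
    def predict(self, X):
        # predict the learned mean value for every input row
        return np.full(len(X), self.mean_)

model = MeanEstimator().fit(np.arange(10), np.arange(10))
model.predict(np.zeros(3))  # array([4.5, 4.5, 4.5])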

Activation Functions

Appendix A uses two activation functions, either implicitly or explicitly: the linear function and the sigmoid function. The Python function activation adds the relu (rectified linear unit) and softplus functions to the set of options. For each of these activation functions, the first derivative is also defined:

In [1]: import numpy as np
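
The book's original listing is not reproduced here. What follows is a minimal sketch of such an activation function, assuming that the import above is NumPy and that a deriv flag requests the first derivative; the exact signature and the sample outputs are illustrative, not the book's code:

In [2]: def activation(x, act='linear', deriv=False):
            ''' Activation functions and their first derivatives (sketch). '''
            if act == 'linear':
                return np.ones_like(x) if deriv else x  # derivative of x is 1
            elif act == 'sigmoid':
                s = 1 / (1 + np.exp(-x))  # sigma(x) = 1 / (1 + exp(-x))
                return s * (1 - s) if deriv else s  # sigma' = sigma * (1 - sigma)
            elif act == 'relu':
                # relu(x) = max(x, 0); its derivative is the unit step function
                return np.where(x > 0, 1., 0.) if deriv else np.maximum(x, 0)
            elif act == 'softplus':
                # softplus(x) = log(1 + exp(x)); its derivative is the sigmoid
                return 1 / (1 + np.exp(-x)) if deriv else np.log(1 + np.exp(x))
            raise ValueError(f'unknown activation "{act}"')

In [3]: x = np.linspace(-2, 2, 5)

In [4]: activation(x, 'relu')
Out[4]: array([0., 0., 0., 1., 2.])

In [5]: activation(x, 'relu', deriv=True)
Out[5]: array([0., 0., 0., 1., 1.])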
