8. Approximating Noncontinuous Functions

This chapter discusses the neural network approximation of noncontinuous functions. This is currently a problematic area for neural networks because network training relies on computing partial derivatives of the error function (via the gradient descent algorithm), and computing them for noncontinuous functions at the points where the function value suddenly jumps or drops produces questionable results. We will dig deeper into this issue in this chapter. The chapter also includes a method I developed that solves this problem.
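The core difficulty can be illustrated numerically. The following is a minimal sketch (not from the book's code): it defines a hypothetical function with a jump discontinuity at x = 1 and estimates its slope with a central finite difference. Away from the jump the estimate converges to the true derivative, but across the jump it grows without bound as the step h shrinks, which is the kind of questionable derivative value described above.

```java
// Illustrates why derivative-based training struggles at a discontinuity.
// f has a jump of size 5 at x = 1 (a hypothetical example function).
public class JumpDerivative {

    static double f(double x) {
        return (x < 1.0) ? x : x + 5.0; // value jumps from 1 to 6 at x = 1
    }

    // Central finite-difference estimate of f'(x)
    static double slope(double x, double h) {
        return (f(x + h) - f(x - h)) / (2.0 * h);
    }

    public static void main(String[] args) {
        // Away from the jump the estimate approaches the true derivative, 1.0
        System.out.printf("slope at x=0.5, h=1e-4: %.4f%n", slope(0.5, 1e-4));

        // Across the jump the estimate behaves like 5 / (2h) and diverges
        for (double h : new double[]{1e-1, 1e-2, 1e-3}) {
            System.out.printf("slope at x=1.0, h=%.0e: %.1f%n", h, slope(1.0, h));
        }
    }
}
```

Running this prints a slope near 1.0 at x = 0.5, but 26.0, 251.0, and 2501.0 at the jump: shrinking the step makes the estimate worse, not better, because no finite derivative exists there.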