Practical Neural Network Recipes in C++

by Timothy Masters
June 2014
Intermediate to advanced
493 pages
20h 30m
English
Morgan Kaufmann
Content preview from Practical Neural Network Recipes in C++
Chapter 5
It is a profound testimony to the approximating power of feedforward networks that such a close and smooth fit can be obtained using only two hidden neurons. The relatively large error that occurs near (0, 1) is partially due to the nonlinearity of the output neuron's activation function. When feedforward networks are used for function approximation and serious noise problems are not expected, the output neurons are usually given linear activation functions. Alternatively, if it is difficult or impossible to change the activation function due to program limitations, the data can be scaled to cover a narrow interval, so that the output neuron operates in the nearly linear central region of its activation function.
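As a rough illustration of that second option (a sketch, not a routine from the book), the C++ snippet below maps target values into a narrow sub-interval of the logistic output range and inverts the mapping on the network's outputs. The interval [0.4, 0.6] is an arbitrary illustrative choice; any band near the sigmoid's center, where the curve is approximately linear, would serve.

// Sketch: compress targets into a narrow band of the logistic range so a
// sigmoid output neuron stays in its nearly linear region, then recover
// predictions in the original units.  The band [0.4, 0.6] is illustrative.
#include <algorithm>
#include <cstdio>
#include <vector>

// Map a raw target t in [lo, hi] into [0.4, 0.6] before training.
double scale_target(double t, double lo, double hi) {
    return 0.4 + 0.2 * (t - lo) / (hi - lo);
}

// Invert the mapping to express a network output in the original units.
double unscale_output(double y, double lo, double hi) {
    return lo + (hi - lo) * (y - 0.4) / 0.2;
}

int main() {
    std::vector<double> targets = {1.0, 2.5, 4.0};
    double lo = *std::min_element(targets.begin(), targets.end());
    double hi = *std::max_element(targets.begin(), targets.end());
    for (double t : targets) {
        double s = scale_target(t, lo, hi);
        std::printf("raw %.2f -> scaled %.3f -> recovered %.2f\n",
                    t, s, unscale_output(s, lo, hi));
    }
    return 0;
}

The same pair of mappings would be applied to the training targets before learning and to the trained network's outputs at prediction time.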

Publisher Resources

ISBN: 9780080514338