April 2015
Intermediate to advanced
1062 pages
40h 35m
English
This chapter deals with neural networks (NNs), starting from the early days of the perceptron and the perceptron rule, and then moves on to review multilayer feed-forward neural networks and the backpropagation algorithm. The drawbacks of training NNs with many layers via the backpropagation algorithm are discussed, together with the advantages one would expect to obtain if such networks could be trained efficiently. Restricted Boltzmann machines (RBMs) are then discussed, and the contrastive divergence algorithm is presented as the vehicle for pre-training deep, many-layer NN architectures. Deep belief networks, conditional RBMs, and autoencoders are also discussed. Finally, two case studies are ...
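The chapter's starting point, the perceptron rule, is simple enough to sketch in a few lines. The following NumPy snippet is an illustrative sketch only (it is not code from the book); the function names and the AND-gate example are our own. It applies the classic error-correcting update, w ← w + η·y·x, to every misclassified sample until the training set is separated.

```python
import numpy as np

def perceptron_train(X, y, epochs=50, lr=1.0):
    """Train a single perceptron with the classic perceptron rule.

    X : (n_samples, n_features) input array
    y : (n_samples,) labels in {-1, +1}
    Returns the weight vector, with the bias folded in as the last component.
    """
    # Augment inputs with a constant 1 so the bias is learned as an extra weight.
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        errors = 0
        for x_i, y_i in zip(Xa, y):
            # Misclassified (or on the boundary): nudge w toward the correct side.
            if y_i * np.dot(w, x_i) <= 0:
                w += lr * y_i * x_i
                errors += 1
        if errors == 0:   # converged; the data are linearly separable
            break
    return w

def perceptron_predict(w, X):
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xa @ w)

# Example: the logical AND function, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)
print(perceptron_predict(w, X))  # [-1. -1. -1.  1.]
```

For linearly separable data such as this, the perceptron rule is guaranteed to converge; the chapter's subsequent material (backpropagation, RBMs, contrastive divergence) addresses the many-layer case, where no such simple guarantee exists.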