Neural Networks and Deep Learning
This chapter deals with neural networks (NNs), starting from the early days of the perceptron and the perceptron rule, then moving on to review multilayer feed-forward neural networks and the backpropagation algorithm. The drawbacks of training NNs with many layers via the backpropagation algorithm are discussed, together with the advantages one would expect if such networks could be trained efficiently. Restricted Boltzmann machines (RBMs) are then discussed, and the contrastive divergence algorithm is presented as the vehicle for pre-training deep, many-layer NN architectures. Deep belief networks, conditional RBMs, and autoencoders are also discussed. Finally, two case studies are ...
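As a taste of the chapter's starting point, the classic perceptron rule mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration, not the chapter's own implementation; the function name, learning rate, and the AND-gate toy data are assumptions chosen for a self-contained example.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Perceptron learning rule: w += lr * (target - prediction) * x.

    X: array of shape (n_samples, n_features); y: labels in {0, 1}.
    (Illustrative sketch only; names and defaults are assumptions.)
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0   # threshold activation
            update = lr * (target - pred)        # zero when correct
            w += update * xi
            b += update
    return w, b

# Toy example: learn the logical AND function (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the rule finds a separating hyperplane in finitely many updates; XOR, by contrast, is not separable, which is exactly the limitation that motivates the multilayer networks reviewed next.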