Convolutional neural networks

This section provides a brief introduction to convolutional neural networks without the Scala implementation.

So far, the layers of perceptrons have been organized as a fully connected network. The number of synapses, or weights, grows significantly as the number and size of the hidden layers increase. For instance, a network with a feature set of dimension 6, three hidden layers of 64 nodes each, and one output value requires 7*64 + 2*65*64 + 65*1 = 8833 weights!
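As a quick check of that arithmetic, the following sketch (not part of the original text) counts the weights of an arbitrary fully connected topology, assuming one bias unit feeding each layer:

```scala
object WeightCount extends App {
  // Weights between consecutive layers: (inputs + 1 bias) * outputs
  def weights(layerSizes: List[Int]): Int =
    layerSizes.sliding(2).map { case List(in, out) => (in + 1) * out }.sum

  // 6 input features, three hidden layers of 64 nodes, one output value
  println(weights(List(6, 64, 64, 64, 1)))  // 7*64 + 2*65*64 + 65*1 = 8833
}
```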

Applications such as image or character recognition require very large feature sets, making the training of a fully connected layered perceptron very computationally intensive. Moreover, these applications need to convey spatial information such as the proximity ...
