December 2018
Intermediate to advanced
421 pages
10h 41m
English
The previously discussed ANN architecture is called a fully connected (FC) neural network (FCNN) because each neuron in layer i is connected to every neuron in layers i-1 and i+1. Each connection between two neurons carries a weight parameter, and each neuron additionally has a bias, so adding more layers and neurons rapidly increases the total number of parameters. As a result, training such networks is very time-consuming even on machines with multiple graphics processing units (GPUs) and multiple central processing units (CPUs). It becomes impossible to train such networks ...
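To make the parameter growth concrete, here is a minimal sketch that counts the trainable parameters of an FC network from its layer sizes. The layer sizes used below are hypothetical, chosen only for illustration; the counting rule itself (n_in * n_out weights plus n_out biases per layer) is standard.

```python
def fc_param_count(layer_sizes):
    """Total trainable parameters in a fully connected network.

    A layer receiving n_in inputs and producing n_out neurons has
    n_in * n_out weights (one per connection) plus n_out biases
    (one per neuron).
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out
    return total

# Hypothetical small network: 784 inputs (e.g. a 28x28 grayscale image),
# two hidden layers, and 10 output neurons.
print(fc_param_count([784, 512, 256, 10]))  # prints 535818
```

Even this modest network already has over half a million parameters; widening or deepening it grows the count quickly, which is why fully connected architectures become expensive to train.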