Published October 2018 · Intermediate to advanced · 368 pages · English

Figure 2.4.1: A 4-layer Dense block in DenseNet. The input to each layer is made of all the previous feature maps.
DenseNet attacks the problem of vanishing gradients using a different approach. Instead of using shortcut connections, the feature maps of all preceding layers become the input to the next layer. The preceding figure shows an example of this dense interconnection within one Dense block.
For simplicity, the figure shows only four layers. Notice that the input to layer l is the concatenation of all previous feature maps. If we designate BN-ReLU-Conv2D as the operation H(x), then the output ...
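The dense interconnection described above can be sketched with the Keras functional API. This is a minimal, illustrative version, not the book's implementation; the helper name `H`, the `growth_rate` value, and the input shape are assumptions chosen for the example.

```python
# Sketch of a 4-layer Dense block: each layer receives the concatenation
# of all previous feature maps. Names and sizes here are illustrative.
from tensorflow.keras.layers import (Input, BatchNormalization, Activation,
                                     Conv2D, Concatenate)
from tensorflow.keras.models import Model

def H(x, filters):
    # The BN-ReLU-Conv2D operation from the text
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = Conv2D(filters, 3, padding='same')(x)
    return x

growth_rate = 12                       # feature maps added per layer (assumed)
inputs = Input(shape=(32, 32, 24))     # assumed input size
x = inputs
for _ in range(4):                     # 4-layer Dense block as in the figure
    y = H(x, growth_rate)              # new feature maps from this layer
    x = Concatenate()([x, y])          # next layer sees all previous maps
model = Model(inputs, x)
```

Because each layer appends `growth_rate` new feature maps, the channel count grows linearly through the block: here 24 + 4 x 12 = 72 channels at the output.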