In this section, we will code up the process of visualizing what the convolution filters are learning, both in the initial layers and in the final layers of the network.
We'll reuse the data that we prepared in steps 1 through 4 of Scenario 1 in the Gender classification using CNN recipe (please refer to the Transfer_learning.ipynb file on GitHub while implementing the code):
- Identify an image for which you want to visualize the intermediate output:
import matplotlib.pyplot as plt
plt.imshow(x[3])
plt.grid(False)
- Define a model, using the functional API, that takes the image as its input and returns the first convolution layer's output:
from keras.applications.vgg16 import ...
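A minimal sketch of this step is shown next. It assumes a pre-trained VGG16 backbone with a 224 x 224 x 3 input and that x[3] is already resized to that shape; adapt the layer name and input shape to the model you actually trained:

from keras.applications.vgg16 import VGG16, preprocess_input
from keras.models import Model
import numpy as np

# Load VGG16 without the fully connected head (assumption: ImageNet weights)
vgg16_model = VGG16(include_top=False, weights='imagenet',
                    input_shape=(224, 224, 3))

# Functional model: same input as VGG16, output taken from the first convolution layer
first_conv_model = Model(inputs=vgg16_model.input,
                         outputs=vgg16_model.get_layer('block1_conv1').output)

# Pass the chosen image through the model to obtain the intermediate activations
img = np.expand_dims(x[3], axis=0)            # add the batch dimension
img = preprocess_input(img.astype('float32'))
first_conv_output = first_conv_model.predict(img)
print(first_conv_output.shape)                # (1, 224, 224, 64) for block1_conv1

Each of the 64 channels in first_conv_output corresponds to one filter in the first convolution layer, and these channels are what we will plot in the following steps.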