- First, as we did in an earlier neural network model, we need to instantiate the Sequential model type and add the first dense layer, as follows:
```python
def dc_model(self):
    model = Sequential()
    model.add(Dense(256*8*8, activation=LeakyReLU(0.2),
                    input_dim=self.LATENT_SPACE_SIZE))
    model.add(BatchNormalization())
```
This first dense layer serves as the input layer and uses a LeakyReLU activation; notice that `input_dim` is the latent space size. The first argument to `Dense` is the number of units: 256*8*8 = 16,384, which will later be reshaped into 256 feature maps of size 8x8. This starting width is a value worth experimenting with: the DCGAN paper starts with 1,024 filters, and starting out with more can help the GAN's convergence.
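The arithmetic behind that first layer can be sketched in plain NumPy. This is a minimal illustration, not the book's implementation; the latent size of 100 is an assumption (the actual value comes from `self.LATENT_SPACE_SIZE`), and the random weights stand in for the layer's learned parameters:

```python
import numpy as np

LATENT_SPACE_SIZE = 100     # assumed latent dimension; adjust to your model
UNITS = 256 * 8 * 8         # 16,384 units in the first dense layer

# A dense layer is a matrix multiply plus a bias:
# (1, 100) @ (100, 16384) -> (1, 16384)
z = np.random.normal(size=(1, LATENT_SPACE_SIZE)).astype(np.float32)
W = np.random.normal(scale=0.02, size=(LATENT_SPACE_SIZE, UNITS)).astype(np.float32)
b = np.zeros(UNITS, dtype=np.float32)
pre_activation = z @ W + b

# LeakyReLU with alpha=0.2, matching the activation in the code above
activated = np.where(pre_activation > 0, pre_activation, 0.2 * pre_activation)

# The 16,384-unit vector is later reshaped into 256 feature maps of size 8x8
feature_maps = activated.reshape(1, 8, 8, 256)
print(feature_maps.shape)
```

The point of sizing the layer as 256*8*8 is exactly this reshape: the flat output of the dense layer becomes a small stack of spatial feature maps that the following upsampling layers can grow into a full image.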
- Next, we need ...