We create a sequential model with the following layers:
- Dense layer with input of shape (self.latent_dim,) and output of shape (*, 256)
- LeakyReLU activation with alpha = 0.2: passes positive values through unchanged and scales negative values by alpha
- Batch normalization: normalizes the activations of the previous layer
- Dense layer with output of shape (*, 512)
- LeakyReLU
- Batch normalization
- Dense layer with output of shape (*, 1024)
- LeakyReLU
- Batch normalization
- Dense layer with output of shape (*, np.prod(self.img_shape)) and tanh activation
- Reshape back to self.img_shape
- The model's input is a noise vector of shape (self.latent_dim,):
```python
from keras.models import Sequential, Model
from keras.layers import Input, Dense, Reshape, BatchNormalization, LeakyReLU
import numpy as np

def build_generator(self):
    model = Sequential()

    model.add(Dense(256, input_dim=self.latent_dim))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))
    model.add(Dense(512))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))
    model.add(Dense(1024))
    model.add(LeakyReLU(alpha=0.2))
    model.add(BatchNormalization(momentum=0.8))
    # Final layer: one unit per pixel, tanh keeps outputs in (-1, 1)
    model.add(Dense(np.prod(self.img_shape), activation='tanh'))
    model.add(Reshape(self.img_shape))

    # Map a latent noise vector to a generated image
    noise = Input(shape=(self.latent_dim,))
    img = model(noise)

    return Model(noise, img)
```
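To see the shape transformations the generator performs, here is a minimal NumPy sketch (no Keras required). The concrete values `latent_dim = 100` and `img_shape = (28, 28, 1)` are assumptions for illustration, not fixed by the snippet above; the tanh stand-in mimics the final activation, which is why training images are typically rescaled to [-1, 1].

```python
import numpy as np

latent_dim = 100         # assumed latent size (hypothetical)
img_shape = (28, 28, 1)  # assumed image shape (hypothetical)
batch = 32

# Sample a batch of noise vectors, matching the Input layer's shape
noise = np.random.normal(0, 1, (batch, latent_dim))

# The final Dense layer emits np.prod(img_shape) values per sample;
# tanh bounds them to (-1, 1) before the Reshape
flat = np.tanh(np.random.randn(batch, int(np.prod(img_shape))))
imgs = flat.reshape((batch,) + img_shape)

print(noise.shape)  # (32, 100)
print(imgs.shape)   # (32, 28, 28, 1)
```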