6 Common design building blocks

This chapter covers

  • Adding new activation functions
  • Inserting new layers to improve training
  • Skipping layers as a useful design pattern
  • Combining new activations, layers, and skip connections into architectures more powerful than the sum of their parts
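
To preview how these building blocks fit together, here is a minimal sketch of a residual block that uses all three ideas at once: a newer activation function, an inserted normalization layer, and a skip connection. The class name, channel sizes, and specific choices (LeakyReLU, BatchNorm) are illustrative assumptions, not the book's exact implementation:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Illustrative block combining the chapter's three building blocks."""

    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),  # inserted layer to improve training
            nn.LeakyReLU(),            # a newer activation function
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.LeakyReLU()

    def forward(self, x):
        # The skip connection adds the input back to the body's output,
        # so the block learns a residual rather than a full transformation.
        return self.act(self.body(x) + x)

x = torch.randn(2, 8, 16, 16)   # batch of 2, 8 channels, 16x16 spatial
y = ResidualBlock(8)(x)
print(y.shape)                  # same shape as the input
```

Because the skip path requires the input and output shapes to match, the convolutions here preserve both the channel count and the spatial size; the chapters ahead cover each of these components in detail.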

At this point, we have learned about the three most common and fundamental types of neural networks: fully connected, convolutional, and recurrent. We have improved all of these architectures by changing the optimizer and learning rate schedule, which alter how we update the parameters (weights) of our models, giving us more accurate models almost for free. Everything we have learned thus far also has a long shelf life and has taught us about problems that have ...
