The workings of neural architecture search

The NN models we've discussed so far were designed by their authors. But what if we could make the computer design the NN itself? Enter neural architecture search (NAS), a technique that automates the design of NNs.

Before we continue, let's see what a network architecture consists of (a short code sketch follows this list):
  • The graph of operations, which represents the network. As we discussed in Chapter 1, The Nuts and bolts of Neural Networks, the operations include convolutions, activation functions, fully connected layers, normalization, and so on.
  • The parameters of each operation. For example, the convolution parameters are: type (cross-channel, depthwise, and so on), input dimensions, number of input ...
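
To make these two ingredients concrete, here is a minimal Python sketch of how a NAS search space might encode a candidate architecture as a graph of operations together with the parameters of each operation. All names here (OpSpec, SEARCH_SPACE, sample_architecture) and the specific parameter choices are illustrative assumptions, not the book's implementation:

```python
# Illustrative sketch: encode a candidate architecture as a small DAG of
# operations, each with its own parameters. Names and values are assumptions.
import random
from dataclasses import dataclass, field

@dataclass
class OpSpec:
    op_type: str                                   # e.g. 'conv', 'depthwise_conv', 'max_pool'
    params: dict = field(default_factory=dict)     # operation-specific parameters
    inputs: list = field(default_factory=list)     # indices of preceding nodes (graph edges)

# The search space: which operations are allowed and which parameter values they may take
SEARCH_SPACE = {
    'conv':           {'kernel_size': [1, 3, 5], 'out_channels': [16, 32, 64]},
    'depthwise_conv': {'kernel_size': [3, 5]},
    'max_pool':       {'kernel_size': [2, 3]},
}

def sample_architecture(num_nodes: int = 4) -> list:
    """Randomly sample one candidate architecture from the search space."""
    graph = []
    for i in range(num_nodes):
        op_type = random.choice(list(SEARCH_SPACE))
        params = {name: random.choice(values)
                  for name, values in SEARCH_SPACE[op_type].items()}
        # connect each node to one earlier node; node 0 takes the network input
        inputs = [random.randrange(i)] if i > 0 else []
        graph.append(OpSpec(op_type, params, inputs))
    return graph

if __name__ == '__main__':
    for node_id, op in enumerate(sample_architecture()):
        print(node_id, op)
```

A NAS algorithm searches over such encodings: it proposes candidates (here by random sampling, but in practice with reinforcement learning, evolution, or gradient-based methods), trains or estimates their accuracy, and keeps the best-performing architectures.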
