Chapter 10. Neural Architecture Search
Neural architecture search (NAS) is a technique for automating the design of neural networks. By searching over many candidate architectures, NAS lets us find an architecture that is well suited to a given problem. For many types of problems, models found by NAS are on par with, or outperform, hand-designed architectures. NAS has recently been a very active area of both research and practical application.
The goal of NAS is to find an optimal model architecture. Keep in mind that the space of possible architectures for a modern neural network is enormous, so automating the search with tools like automated machine learning (AutoML) makes a lot of sense, but it can be very demanding of compute resources.
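To make the idea of an architecture search space concrete, here is a minimal sketch using KerasTuner (the keras_tuner package), in which the number of hidden layers and the width of each layer vary from trial to trial. The specific ranges, the MNIST-style input shape, and the layer types are illustrative assumptions, not a prescription from this chapter.

```python
import keras_tuner as kt
import tensorflow as tf


def build_model(hp):
    """Define a small architecture search space: depth and width vary per trial."""
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Flatten(input_shape=(28, 28)))

    # The number of hidden layers is itself a searchable choice.
    for i in range(hp.Int("num_layers", min_value=1, max_value=4)):
        model.add(
            tf.keras.layers.Dense(
                units=hp.Int(f"units_{i}", min_value=32, max_value=256, step=32),
                activation="relu",
            )
        )
    model.add(tf.keras.layers.Dense(10, activation="softmax"))

    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

Even this toy search space contains hundreds of distinct architectures, which hints at why a naive exhaustive search quickly becomes impractical and why more efficient search strategies matter.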
In this chapter, we will introduce techniques for optimizing your ML models, starting with hyperparameter tuning and then moving on to NAS and AutoML. At the end of the chapter, we will look at cloud services for AutoML.
Hyperparameter Tuning
Before taking a deep dive into NAS, let’s look at the problem it solves by examining one of the most tedious processes in ML modeling (at least when done naively): hyperparameter tuning. As you’ll see, hyperparameter tuning and NAS have a lot in common. We’ll assume that you’re already familiar with hyperparameter tuning, so rather than covering it in detail, this section focuses on how it relates to NAS.
In ML models, there are two types of parameters: trainable parameters, which the training algorithm learns from the data (for example, the weights of a neural network), and hyperparameters, which are set before training begins (for example, the learning rate or the number of layers).
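As a concrete illustration of hyperparameter tuning, the following sketch uses KerasTuner’s RandomSearch to try a handful of learning rates for a fixed architecture. The objective, trial count, and the x_train/y_train placeholders are assumptions made for the example; any tuner and search space could be substituted.

```python
import keras_tuner as kt
import tensorflow as tf


def build_model(hp):
    """Fixed architecture; only the learning rate is tuned."""
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Hyperparameters are set before training, not learned from the data.
    learning_rate = hp.Choice("learning_rate", values=[1e-2, 1e-3, 1e-4])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


# Randomly sample hyperparameter combinations and keep the best by validation accuracy.
tuner = kt.RandomSearch(
    build_model,
    objective="val_accuracy",
    max_trials=10,
    directory="hp_tuning",
    project_name="demo",
)

# With real data loaded as (x_train, y_train), hypothetical here:
# tuner.search(x_train, y_train, epochs=5, validation_split=0.2)
# best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
```

Each trial trains a full model, which is why tuning over large search spaces gets expensive quickly and why the same efficiency concerns carry over to NAS.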