Tuning Hyperparameters
We built a neural network and prepared its input data, but that was only the beginning. The same algorithm running on the same data can yield wildly different results, depending on hyperparameters such as the learning rate and the number of hidden nodes.
ML development is mostly about finding good values for those hyperparameters. Compared to software development, that task can look like a form of black magic: there is no hard-and-fast rule that tells you how to set them. In fact, so far I've chickened out whenever the issue came up, offering only vague advice like: “try different values for the hyperparameters, and see which ones work better.”
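Taken literally, that advice amounts to a brute-force search: loop over candidate values, train a network for each combination, and keep the one that scores best on held-out data. The book builds its own NumPy network, but here is a minimal sketch of that idea using scikit-learn's MLPClassifier and the small digits dataset as stand-ins, just to show the shape of such a search:

```python
# A sketch of "try different values and see which ones work better".
# MLPClassifier and load_digits stand in for the network and dataset
# built earlier in the book.
from itertools import product

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0)

learning_rates = [0.001, 0.01, 0.1]   # candidate learning rates
hidden_nodes = [10, 50, 100]          # candidate hidden layer sizes

best_accuracy, best_params = 0.0, None
for lr, n_hidden in product(learning_rates, hidden_nodes):
    model = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                          learning_rate_init=lr,
                          max_iter=300,
                          random_state=0)
    model.fit(X_train, y_train)
    accuracy = model.score(X_val, y_val)  # accuracy on held-out data
    print(f"lr={lr}, hidden={n_hidden}: {accuracy:.3f}")
    if accuracy > best_accuracy:
        best_accuracy, best_params = accuracy, (lr, n_hidden)

print("Best:", best_params, "with accuracy", round(best_accuracy, 3))
```

Even this naive loop gets expensive quickly: every extra hyperparameter multiplies the number of combinations, and each combination costs a full training run, which is why more concrete guidelines are worth having.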
In this section, I’ll give you some more concrete guidelines. Here’s ...