December 2018
k-nearest neighbors (KNN) is an example of a non-parametric model, meaning it has no parameters to be learned from the data; however, it has one very important hyperparameter: the number of neighbors. Back in Chapter 4, Predicting Numerical Values with Machine Learning, we used a KNN model with 12 neighbors on our diamond prices dataset, and that model gave us the best results when compared with multiple linear regression and lasso. We used 12 because I thought it was a good value, but there is no guarantee that it is the best one; why not nine or 15 neighbors? Maybe 10 works best. That is what the activity of hyperparameter optimization is all about—finding the set of best hyperparameters, or, more often, a ...
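As a minimal sketch of this idea, the snippet below uses scikit-learn's GridSearchCV to try a range of candidate values for n_neighbors with cross-validation and report the best one. It runs on synthetic regression data as a stand-in for the diamond prices dataset, so the data shape and the candidate range are illustrative assumptions, not the book's exact setup:

```python
# Tuning KNN's n_neighbors via cross-validated grid search.
# make_regression stands in for the diamond prices data (an assumption).
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsRegressor

# Synthetic dataset: 500 samples, 6 features, some noise
X, y = make_regression(n_samples=500, n_features=6, noise=10.0, random_state=0)

# Candidate neighbor counts, including 9, 12, and 15 from the discussion above
param_grid = {"n_neighbors": list(range(3, 21))}

search = GridSearchCV(
    KNeighborsRegressor(),
    param_grid,
    cv=5,                                 # 5-fold cross-validation
    scoring="neg_mean_squared_error",     # lower MSE is better
)
search.fit(X, y)

# The value of n_neighbors with the best cross-validated score
print(search.best_params_)
```

Rather than trusting a single hand-picked value, the search evaluates every candidate on held-out folds, so the winner reflects generalization performance instead of a guess.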