July 2017
Intermediate to advanced
382 pages
English
Returning to our k-NN classifier, we find that we have only one hyperparameter to tune: k. Typically, you would face a much larger number of hyperparameters, but the k-NN algorithm is simple enough for us to implement grid search by hand.
Before we get started, we need to split the dataset into training and test sets, as we have done before. Here we choose a 75-25 split:
In [1]: from sklearn.datasets import load_iris
   ...: from sklearn.model_selection import train_test_split
   ...: import numpy as np
   ...: iris = load_iris()
   ...: X = iris.data.astype(np.float32)
   ...: y = iris.target
In [2]: X_train, X_test, y_train, y_test = train_test_split(
   ...:     X, y, random_state=37
   ...: )
Then the goal is to loop over all possible values of k. As we do this, we want to keep track of the accuracy that each value achieves, so that we can pick the best-performing k at the end.
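The loop just described can be sketched as follows. This is a minimal, self-contained version using scikit-learn's `KNeighborsClassifier`; the chapter's own classifier implementation may differ, and the range of k values tried here (1 through 19) is an assumption for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
import numpy as np

iris = load_iris()
X = iris.data.astype(np.float32)
y = iris.target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=37)

# Manual grid search: try each candidate k and remember the best test accuracy
best_acc, best_k = 0.0, None
for k in range(1, 20):  # candidate values of k (range chosen for illustration)
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    acc = knn.score(X_test, y_test)  # accuracy on the held-out test set
    if acc > best_acc:
        best_acc, best_k = acc, k

print(best_k, best_acc)
```

Note that scoring every candidate against the same test set effectively tunes k on that set; a validation split or cross-validation is the more rigorous way to do this, as grid-search discussions usually go on to explain.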