If we fit data to a straight line, the parameters of the mathematical model are the slope and intercept of the line. To determine the parameters of a model, we fit it on a subset of the data (the training set) and evaluate its performance on the rest of the data (the test set). This is called validation, and there are more elaborate schemes; the scikit-learn GridSearchCV class, for example, uses k-fold cross-validation (see the sketch below).
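The following is a minimal sketch, not taken from the book, of the workflow just described: hold out a test set, then let GridSearchCV run k-fold cross-validation on the training set. The synthetic straight-line data, the choice of Ridge as the estimator, and the alpha grid are illustrative assumptions, not anything prescribed by the text.

# Sketch: train/test split plus GridSearchCV with k-fold cross-validation.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X.ravel() + 2.0 + rng.normal(scale=1.0, size=200)  # noisy straight line

# Hold out a test set; the model's parameters are learned on the training set only.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# GridSearchCV performs k-fold cross-validation (here cv=5) over a hyperparameter grid.
search = GridSearchCV(Ridge(), param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)

print("best alpha:", search.best_params_["alpha"])
print("test R^2:", search.score(X_test, y_test))  # evaluate on the held-out data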
Classifiers and regressors usually take extra parameters (hyperparameters), such as the number of components of an ensemble, which usually have nothing to do with the slope and intercept of the linear model mentioned in the first sentence (the sketch below contrasts the two). It's a bit confusing to talk about models because we have ...
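As a rough illustration of that distinction, here is a short sketch on synthetic data: the slope and intercept of LinearRegression are learned from the data, while the number of components of a random forest ensemble (n_estimators) is a hyperparameter we choose before fitting. The data and estimator choices are assumptions for illustration only.

# Sketch: learned parameters (slope, intercept) vs. a chosen hyperparameter (n_estimators).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(100, 1))
y = 3.0 * X.ravel() + 2.0 + rng.normal(scale=1.0, size=100)

line = LinearRegression().fit(X, y)
print("fitted slope:", line.coef_[0])        # parameter estimated from the data
print("fitted intercept:", line.intercept_)  # parameter estimated from the data

forest = RandomForestRegressor(n_estimators=50, random_state=1).fit(X, y)
print("ensemble size (hyperparameter):", forest.n_estimators)  # set by us, not learned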