An example

We now look at a practical example, containing what we've seen so far in this chapter.

Our dataset is artificially created: it is composed of 10,000 observations and 10 features, all of them informative (that is, none redundant), with labels "0" and "1" (binary classification). Having only informative features is not an unrealistic assumption in machine learning, since a feature selection or feature reduction step typically retains only unrelated, non-redundant features anyway.

In:
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=10000, n_features=10,
                           n_informative=10, n_redundant=0,
                           random_state=101)
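As a quick sanity check on the generated dataset (a sketch of our own, not part of the original text), we can inspect its shape and confirm that the two classes are roughly balanced:

```python
import numpy as np
from sklearn.datasets import make_classification

# Recreate the dataset exactly as above.
X, y = make_classification(n_samples=10000, n_features=10,
                           n_informative=10, n_redundant=0,
                           random_state=101)

print(X.shape)         # (10000, 10): 10,000 observations, 10 features
print(np.bincount(y))  # counts of labels 0 and 1, roughly balanced
```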

Now we'll show you how to use different libraries, and different modules, to perform the classification task with logistic regression. We won't focus here on ...
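As one possible approach (a sketch under our own choices of splitter and metric, not necessarily the book's exact code), here is logistic regression on this dataset using scikit-learn's `LogisticRegression`:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Same synthetic dataset as above.
X, y = make_classification(n_samples=10000, n_features=10,
                           n_informative=10, n_redundant=0,
                           random_state=101)

# Hold out 20% of the data for evaluation (split choice is ours).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=101)

# Fit the classifier and score it on the held-out set.
clf = LogisticRegression().fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"Test accuracy: {acc:.3f}")
```

The same fit/predict pattern applies regardless of which scikit-learn classifier you swap in, which is what makes comparing libraries and modules straightforward.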
