Classification with a multi-layer perceptron

We can now build an architecture with two dense layers and train a classifier for a more complex dataset. Let's start by creating it:

>>> from sklearn.datasets import make_classification
>>> nb_samples = 1000
>>> nb_features = 3
>>> X, Y = make_classification(n_samples=nb_samples, n_features=nb_features,
...                            n_informative=3, n_redundant=0, n_classes=2,
...                            n_clusters_per_class=3)

Even though there are only two classes, the dataset has three features and three clusters per class; therefore, it's almost impossible for a linear classifier to separate it with high accuracy. A plot of the dataset is shown in the following figure:

For benchmarking purposes, it's useful to test a logistic regression:

from sklearn.model_selection ...
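The snippet above is cut off after the import. A plausible completion is sketched below, assuming a standard cross-validated logistic regression benchmark; the choice of 10 folds and accuracy scoring are assumptions, not taken from the original:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Recreate the dataset from the previous snippet
X, Y = make_classification(n_samples=1000, n_features=3,
                           n_informative=3, n_redundant=0, n_classes=2,
                           n_clusters_per_class=3, random_state=1000)

# Benchmark: cross-validated accuracy of a plain logistic regression
# (cv=10 and scoring='accuracy' are assumptions for illustration)
lr = LogisticRegression()
scores = cross_val_score(lr, X, Y, scoring='accuracy', cv=10)
print(scores.mean())
```

Because the clusters overlap in ways no single hyperplane can separate, the mean accuracy of this linear baseline is expected to be noticeably below what a non-linear model can reach.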

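The multi-layer perceptron itself is not shown in this excerpt. A minimal sketch of a network with two dense layers (one hidden layer plus the output layer), using scikit-learn's MLPClassifier as a stand-in for whatever library the original section uses, could look like this; the hidden-layer size, activation, and train/test split are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Recreate the dataset from the beginning of the section
X, Y = make_classification(n_samples=1000, n_features=3,
                           n_informative=3, n_redundant=0, n_classes=2,
                           n_clusters_per_class=3, random_state=1000)

X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2,
                                                    random_state=1000)

# One hidden dense layer (50 units, assumed) plus the dense output layer
mlp = MLPClassifier(hidden_layer_sizes=(50,), activation='relu',
                    max_iter=2000, random_state=1000)
mlp.fit(X_train, Y_train)
print(mlp.score(X_test, Y_test))
```

The non-linear hidden layer lets the network carve out the per-class clusters, so its test accuracy should exceed the logistic regression baseline on this dataset.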