We can now build an architecture with two dense layers and train a classifier for a more complex dataset. Let's start by creating it:
from sklearn.datasets import make_classification

nb_samples = 1000
nb_features = 3

X, Y = make_classification(n_samples=nb_samples, n_features=nb_features,
                           n_informative=3, n_redundant=0,
                           n_classes=2, n_clusters_per_class=3)
Even though there are only two classes, the dataset has three features and three clusters per class; therefore, it's very unlikely that a linear classifier can separate it with high accuracy. A plot of the dataset is shown in the following graph:
For benchmarking purposes, it's useful to test a ...
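As noted above, a linear model should struggle on this dataset because each class is split into three clusters. A minimal sketch of such a linear baseline (the choice of `LogisticRegression`, the cross-validation setup, and the `random_state` value are illustrative assumptions, not the book's exact benchmark):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Recreate the dataset (random_state fixed here only for reproducibility)
X, Y = make_classification(n_samples=1000, n_features=3,
                           n_informative=3, n_redundant=0,
                           n_classes=2, n_clusters_per_class=3,
                           random_state=1000)

# 10-fold cross-validated accuracy of a plain logistic regression
lr = LogisticRegression(max_iter=1000)
scores = cross_val_score(lr, X, Y, cv=10, scoring='accuracy')
print('Mean CV accuracy: {:.3f}'.format(scores.mean()))
```

The mean accuracy gives a floor that the two-dense-layer network should clearly beat; a score well below 1.0 confirms that a single linear decision boundary cannot capture the multi-cluster structure.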