Predicting ad click-through with logistic regression using gradient descent

Having worked through a brief example, we now apply the algorithm we just developed to our click-through prediction project.

Here, we start with only 10,000 training samples (you will soon see why we don't start with 270,000, as we did in the previous chapter):

>>> import pandas as pd
>>> n_rows = 300000
>>> df = pd.read_csv("train", nrows=n_rows)
>>> X = df.drop(['click', 'id', 'hour', 'device_id', 'device_ip'],
...             axis=1).values
>>> Y = df['click'].values
>>> n_train = 10000
>>> X_train = X[:n_train]
>>> Y_train = Y[:n_train]
>>> X_test = X[n_train:]
>>> Y_test = Y[n_train:]
>>> from sklearn.preprocessing import OneHotEncoder
>>> enc = OneHotEncoder(handle_unknown='ignore')
>>> X_train_enc = enc.fit_transform(X_train)
>>> X_test_enc = enc.transform(X_test)
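With the features one-hot encoded, the next step is to train the model with gradient descent and measure how well it ranks clicks on held-out data. What follows is a minimal, self-contained sketch of that step; the function names, the hyperparameters (max_iter=10000, learning_rate=0.01), and the choice to score only the first 10,000 test samples (to keep the dense arrays small) are assumptions for illustration rather than the chapter's exact code:

>>> import numpy as np
>>> from sklearn.metrics import roc_auc_score
>>> def sigmoid(z):
...     # squash raw scores into (0, 1) probabilities
...     return 1.0 / (1 + np.exp(-z))
>>> def compute_prediction(X, weights):
...     # predicted click probability for each sample
...     return sigmoid(np.dot(X, weights))
>>> def update_weights_gd(X, y, weights, learning_rate):
...     # one full-batch gradient descent step on the log loss
...     predictions = compute_prediction(X, weights)
...     weights += learning_rate / len(y) * np.dot(X.T, y - predictions)
...     return weights
>>> def train_logistic_regression(X, y, max_iter, learning_rate,
...                               fit_intercept=False):
...     if fit_intercept:
...         # prepend a column of ones so the bias is learned as a weight
...         X = np.hstack((np.ones((X.shape[0], 1)), X))
...     weights = np.zeros(X.shape[1])
...     for _ in range(max_iter):
...         weights = update_weights_gd(X, y, weights, learning_rate)
...     return weights
>>> weights = train_logistic_regression(X_train_enc.toarray(), Y_train,
...                                     max_iter=10000,
...                                     learning_rate=0.01,
...                                     fit_intercept=True)
>>> # score a 10,000-sample slice of the test set; remember to add
>>> # the same intercept column used during training
>>> X_test_dense = X_test_enc[:10000].toarray()
>>> X_test_aug = np.hstack((np.ones((X_test_dense.shape[0], 1)),
...                         X_test_dense))
>>> pred = compute_prediction(X_test_aug, weights)
>>> print(f'Training samples: {n_train}, '
...       f'AUC on testing set: {roc_auc_score(Y_test[:10000], pred):.3f}')

Even with only 10,000 training samples, running 10,000 full-batch updates over thousands of one-hot features takes a noticeable amount of time, which hints at why we don't start with 270,000 samples here.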
