December 2018 · Beginner to intermediate · 684 pages · 21h 9m · English
For illustration purposes, we also train a LightGBM gradient-boosting tree ensemble with default settings and the multiclass objective:
param = {'objective': 'multiclass', 'num_class': 5}
booster = lgb.train(params=param,
                    train_set=lgb_train,
                    num_boost_round=500,
                    early_stopping_rounds=20,
                    valid_sets=[lgb_train, lgb_test])
The default settings do not improve on multinomial logistic regression, although further parameter tuning remains an untapped option:
y_pred_class = booster.predict(test_dtm_numeric.astype(float))
accuracy_score(test.stars, y_pred_class.argmax(1) + 1)

0.738665855696524
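Since `predict` returns one probability per class for each document, the row-wise `argmax` plus one recovers the star rating: class indices 0–4 map back to stars 1–5. A minimal NumPy sketch of that mapping, using a hypothetical probability matrix in place of the booster's output:

```python
import numpy as np

# Hypothetical probability matrix shaped like booster.predict() output:
# one row per document, one column per class (classes 0-4 <-> stars 1-5).
proba = np.array([[0.10, 0.20, 0.10, 0.50, 0.10],   # peak at class 3 -> 4 stars
                  [0.70, 0.10, 0.10, 0.05, 0.05],   # peak at class 0 -> 1 star
                  [0.05, 0.05, 0.10, 0.20, 0.60]])  # peak at class 4 -> 5 stars

# argmax over the class axis picks the most probable class index;
# adding 1 shifts the zero-based index back to the 1-5 star scale.
pred_stars = proba.argmax(1) + 1
print(pred_stars)  # [4 1 5]
```

Comparing `pred_stars` against the true `stars` column with `accuracy_score` yields the accuracy reported above.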