
Numerical Computing with Python

by Pratap Dangeti, Allen Yu, Claire Chung, Aldrin Yim
December 2018
Beginner to intermediate
682 pages
English
Packt Publishing
Content preview from Numerical Computing with Python

Extreme gradient boosting - XGBoost classifier

XGBoost is an algorithm developed in 2014 by Tianqi Chen based on gradient boosting principles, and it has taken the data science community by storm since its inception. XGBoost was designed with careful attention to both system optimization and machine learning principles. The goal of the library is to push machines to their computational limits in order to deliver scalable, portable, and accurate results:

# Xgboost Classifier
>>> import xgboost as xgb
>>> xgb_fit = xgb.XGBClassifier(max_depth=2, n_estimators=5000, learning_rate=0.05)
>>> xgb_fit.fit(x_train, y_train)
>>> print ("\nXGBoost - Train Confusion Matrix\n\n", pd.crosstab(y_train, xgb_fit.predict(x_train), rownames ...
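The preview cuts the listing off mid-call. The following is a minimal, self-contained sketch of the same pattern; the dataset, the train/test split, and the "Actual"/"Predicted" crosstab labels are illustrative assumptions, not the book's exact setup.

# Self-contained sketch (assumed data and labels, not the book's original setup)
import pandas as pd
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Illustrative data: a built-in binary classification dataset
data = load_breast_cancer()
x_train, x_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.3, random_state=42)

# Shallow trees, many boosting rounds, small learning rate
xgb_fit = xgb.XGBClassifier(max_depth=2, n_estimators=5000, learning_rate=0.05)
xgb_fit.fit(x_train, y_train)

# Confusion matrices and accuracy on train and test data
print("\nXGBoost - Train Confusion Matrix\n\n",
      pd.crosstab(y_train, xgb_fit.predict(x_train),
                  rownames=["Actual"], colnames=["Predicted"]))
print("\nXGBoost - Train Accuracy:",
      round(accuracy_score(y_train, xgb_fit.predict(x_train)), 3))

print("\nXGBoost - Test Confusion Matrix\n\n",
      pd.crosstab(y_test, xgb_fit.predict(x_test),
                  rownames=["Actual"], colnames=["Predicted"]))
print("\nXGBoost - Test Accuracy:",
      round(accuracy_score(y_test, xgb_fit.predict(x_test)), 3))

The hyperparameters mirror those in the excerpt: many shallow trees (max_depth=2, n_estimators=5000) combined with a small learning rate (0.05) is a common way to trade training time for better generalization in gradient boosting.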


Publisher Resources

ISBN: 9781789953633
Errata Page