LightGBM

When your dataset contains a large number of cases or variables, XGBoost can take a long time to train, even though it is compiled from C++. Therefore, in spite of XGBoost's success (its first appearance dates back to March 2015), in January 2017 there was room for another algorithm to appear: the high-performance LightGBM, developed by a team at Microsoft as an open source project, which can be run in a distributed fashion and can handle large amounts of data quickly.

Here is its GitHub page: https://github.com/Microsoft/LightGBM. And here is the academic paper describing the idea behind the algorithm: https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision-tree.
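As a quick sketch of what working with the library looks like in Python (assuming the lightgbm package has been installed, for instance with pip install lightgbm; the dataset and parameter values below are purely illustrative), its scikit-learn-compatible wrapper can be used as follows:

```python
# Minimal sketch: train a LightGBM classifier via its scikit-learn wrapper
import lightgbm as lgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy dataset, used here only for illustration
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# n_estimators and num_leaves are example values, not tuned settings
clf = lgb.LGBMClassifier(n_estimators=100, num_leaves=31, random_state=42)
clf.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```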

LightGBM is based on decision ...
