LightGBM

Original author(s): Guolin Ke[1] / Microsoft Research
Developer(s): Microsoft and LightGBM contributors[2]
Initial release: 2016
Stable release: v4.3.0[3] / January 15, 2024
Repository: github.com/microsoft/LightGBM
Written in: C++, Python, R, C
Operating system: Windows, macOS, Linux
Type: Machine learning, gradient-boosting framework
License: MIT License
Website: lightgbm.readthedocs.io

LightGBM, short for Light Gradient-Boosting Machine, is a free and open-source distributed gradient-boosting framework for machine learning, originally developed by Microsoft.[4][5] It is based on decision tree algorithms and used for ranking, classification and other machine learning tasks. The development focus is on performance and scalability.

Overview

The LightGBM framework supports different algorithms, including GBT, GBDT, GBRT, GBM, MART[6][7] and RF.[8] LightGBM shares many of XGBoost's advantages, including sparse optimization, parallel training, multiple loss functions, regularization, bagging, and early stopping. A major difference between the two lies in how trees are constructed. LightGBM does not grow a tree level-wise (row by row) as most other implementations do.[9] Instead, it grows trees leaf-wise, choosing the leaf it estimates will yield the largest decrease in loss.[citation needed] In addition, LightGBM does not use the widely used sort-based decision tree learning algorithm, which searches for the best split point on sorted feature values,[10] as XGBoost and other implementations do. Instead, LightGBM implements a highly optimized histogram-based decision tree learning algorithm, which yields large advantages in both efficiency and memory consumption.[11] The LightGBM algorithm also uses two novel techniques, Gradient-Based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB), which allow it to run faster while maintaining a high level of accuracy.[12]
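
Both design choices surface directly as tuning parameters of the Python package. The following is a minimal sketch, using made-up toy data rather than anything from the article, of training a binary classifier: num_leaves caps the leaf-wise tree growth described above, and max_bin sets the number of histogram bins used for split finding.

    # Minimal sketch: leaf-wise growth and histogram-based split finding
    # exposed as parameters of the LightGBM Python package.
    import numpy as np
    import lightgbm as lgb

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 20))           # toy feature matrix
    y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy binary labels

    train_set = lgb.Dataset(X, label=y)
    params = {
        "objective": "binary",
        "num_leaves": 31,      # leaf-wise growth: at most 31 leaves per tree
        "max_bin": 255,        # histogram-based learning: 255 feature bins
        "learning_rate": 0.1,
    }
    booster = lgb.train(params, train_set, num_boost_round=100)
    print(booster.predict(X[:5]))  # predicted probabilities for five rows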

LightGBM works on Linux, Windows, and macOS and supports C++, Python,[13] R, and C#.[14] The source code is licensed under the MIT License and is available on GitHub.[15]

Gradient-based one-side sampling

Gradient-based one-side sampling (GOSS) is a sampling method that exploits the fact that data instances carry no native weights in GBDT. Because instances with different gradients play different roles in the computation of information gain, instances with larger gradients contribute more to the gain. To retain accuracy, GOSS therefore keeps all instances with large gradients and randomly samples the instances with small gradients, up-weighting the sampled small-gradient instances to compensate for those that were dropped.[12]
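
The sampling step can be illustrated with a short standalone sketch (written here in plain NumPy, not LightGBM's internal implementation): instances are ranked by the magnitude of their gradients, the top fraction a is kept, a fraction b of the remainder is sampled at random, and the sampled small-gradient instances are up-weighted by (1 - a) / b so the estimated information gain stays approximately unbiased.

    # Illustrative sketch of the GOSS idea, not LightGBM's internal code.
    import numpy as np

    def goss_sample(gradients, a=0.2, b=0.1, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        n = len(gradients)
        order = np.argsort(-np.abs(gradients))   # rank by |gradient|, descending
        n_top = int(a * n)
        top_idx = order[:n_top]                  # large-gradient instances: always kept
        rest = order[n_top:]
        rand_idx = rng.choice(rest, size=int(b * n), replace=False)
        idx = np.concatenate([top_idx, rand_idx])
        weights = np.ones(len(idx))
        weights[n_top:] = (1 - a) / b            # compensate for dropped small-gradient instances
        return idx, weights

    grads = np.random.default_rng(0).normal(size=1000)
    idx, w = goss_sample(grads)
    print(len(idx), w[:3], w[-3:])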

Exclusive feature bundling

Exclusive feature bundling (EFB) is a near-lossless method for reducing the number of effective features. In a sparse feature space, many features are nearly exclusive, meaning they rarely take nonzero values simultaneously. One-hot encoded features are a typical example of exclusive features. EFB bundles such features together, reducing dimensionality to improve efficiency while maintaining a high level of accuracy. A group of exclusive features merged into a single feature is called an exclusive feature bundle.[12]
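
The bundling idea can be shown with a small standalone sketch (again, not LightGBM's internal code): features that are never nonzero on the same row are merged into one column by giving each of them a disjoint range of values, so the histogram algorithm sees a single dense column instead of several sparse ones.

    # Illustrative sketch of the EFB idea, not LightGBM's internal code.
    import numpy as np

    def bundle_exclusive(features):
        """Merge mutually exclusive integer-binned features into one column.

        features: list of 1-D arrays of non-negative ints (0 = "not present").
        Each feature is shifted by the total number of bins of the features
        bundled before it, so the merged values remain distinguishable.
        """
        bundled = np.zeros(len(features[0]), dtype=int)
        offset = 0
        for f in features:
            nonzero = f > 0
            # EFB's assumption: the bundled features rarely (here: never) overlap
            assert not np.any(bundled[nonzero]), "features are not exclusive"
            bundled[nonzero] = f[nonzero] + offset
            offset += int(f.max())
        return bundled

    # Two one-hot style columns that are never active on the same row.
    f1 = np.array([1, 0, 0, 2, 0])
    f2 = np.array([0, 3, 0, 0, 1])
    print(bundle_exclusive([f1, f2]))   # -> [1 5 0 2 3]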

See also

References

  1. ^ "Guolin Ke". GitHub.
  2. ^ "microsoft/LightGBM". GitHub. 7 July 2022.
  3. ^ "Releases · microsoft/LightGBM". GitHub.
  4. ^ Brownlee, Jason (March 31, 2020). "Gradient Boosting with Scikit-Learn, XGBoost, LightGBM, and CatBoost".
  5. ^ Kopitar, Leon; Kocbek, Primoz; Cilar, Leona; Sheikh, Aziz; Stiglic, Gregor (July 20, 2020). "Early detection of type 2 diabetes mellitus using machine learning-based prediction models". Scientific Reports. 10 (1): 11981. Bibcode:2020NatSR..1011981K. doi:10.1038/s41598-020-68771-z. PMC 7371679. PMID 32686721 – via www.nature.com.
  6. ^ "Understanding LightGBM Parameters (and How to Tune Them)". neptune.ai. May 6, 2020.
  7. ^ "An Overview of LightGBM". avanwyk. May 16, 2018.
  8. ^ "Parameters — LightGBM 3.0.0.99 documentation". lightgbm.readthedocs.io.
  9. ^ "The Gradient Boosters IV: LightGBM". Deep & Shallow.
  10. ^ Mehta, Manish; Agrawal, Rakesh; Rissanen, Jorma (Nov 24, 2020). "SLIQ: A fast scalable classifier for data mining". International Conference on Extending Database Technology: 18–32. CiteSeerX 10.1.1.89.7734.
  11. ^ "Features — LightGBM 3.1.0.99 documentation". lightgbm.readthedocs.io.
  12. ^ a b c Ke, Guolin; Meng, Qi; Finley, Thomas; Wang, Taifeng; Chen, Wei; Ma, Weidong; Ye, Qiwei; Liu, Tie-Yan (2017). "LightGBM: A Highly Efficient Gradient Boosting Decision Tree". Advances in Neural Information Processing Systems. 30.
  13. ^ "lightgbm: LightGBM Python Package". 7 July 2022 – via PyPI.
  14. ^ "Microsoft.ML.Trainers.LightGbm Namespace". docs.microsoft.com.
  15. ^ "microsoft/LightGBM". October 6, 2020 – via GitHub.
Further reading

External links

