XGBoost vs LightGBM
XGBoost Algorithm:
A very popular and in-demand algorithm, often referred to as the winning algorithm in many machine learning competitions. XGBoost stands for Extreme Gradient Boosting. It is an improved version of the Gradient Boosting Algorithm; the base algorithm is the Gradient Boosting Decision Tree. Its strong predictive power and straightforward implementation have made it a staple of many machine learning notebooks. Some key points of the algorithm are as follows:
1) Regularization: includes L1 and L2 regularization terms, which plain gradient boosting lacks.
2) Parallel processing: tree construction is parallelized across features.
3) Handling missing values: the algorithm learns a default split direction for missing data.
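Since the base algorithm is gradient boosted decision trees, the core idea can be sketched in a few lines: each new learner is fit to the residuals (the negative gradients of squared error) of the current ensemble. The following is a minimal pure-Python sketch using decision stumps — an illustration of the boosting idea only, not the actual XGBoost implementation, which adds regularization, second-order gradients, and far more efficient tree construction:

```python
# Minimal gradient boosting with decision stumps on 1-D data.
# Illustrative sketch only -- real XGBoost adds regularization,
# second-order gradients, and efficient parallel tree building.

def fit_stump(x, residuals):
    """Find the threshold split whose leaf means best fit the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi: lmean if xi <= t else rmean

def gradient_boost(x, y, n_rounds=20, lr=0.5):
    """Additive model: each new stump fits the current residuals."""
    pred = [0.0] * len(x)
    stumps = []
    for _ in range(n_rounds):
        # For squared error, the negative gradient is just the residual.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * s(xi) for s in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 3.1, 3.0, 2.9]
model = gradient_boost(x, y)
```

Each round reduces the training residuals, so the ensemble's predictions converge toward the targets — the same additive-model mechanism that XGBoost accelerates and regularizes.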
Some parameters that can be tuned to improve performance are as follows:
General Parameters include the following:
1) booster: the type of model run at each iteration (gbtree, gblinear, or dart).
2) nthread: the number of parallel threads used for training.
3) verbosity: the level of messages printed during training.
Booster Parameters include the following:
1) eta: the learning rate; shrinks the contribution of each new tree.
2) max_depth: the maximum depth of a tree; deeper trees fit more complex patterns but overfit more easily.
3) min_child_weight: the minimum sum of instance weights needed in a child node.
4) gamma: the minimum loss reduction required to make a further split.
5) subsample and colsample_bytree: the row and column sampling ratios per tree.
Learning Task Parameters include the following:
1) objective: defines the loss function to be used (e.g. reg:squarederror for regression, binary:logistic for binary classification).
2) eval_metric: the metric used on validation data (e.g. rmse, logloss, auc).
3) seed: the random number seed, for reproducibility.
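Putting the three parameter groups together, a typical configuration for a binary classification task might look like the following. This is a sketch assuming the standard xgboost Python API; the parameter names are real XGBoost parameters, but the values are arbitrary examples, not tuned recommendations:

```python
# Example XGBoost parameter dictionary (illustrative values, not tuned).
params = {
    # General parameters
    "booster": "gbtree",             # tree-based booster (vs. "gblinear")
    # Booster parameters
    "eta": 0.1,                      # learning rate: shrinks each tree's contribution
    "max_depth": 6,                  # maximum depth of each tree
    "subsample": 0.8,                # fraction of rows sampled per tree
    # Learning task parameters
    "objective": "binary:logistic",  # logistic loss for binary classification
    "eval_metric": "logloss",        # metric reported during evaluation
}
# Typically passed as: xgboost.train(params, dtrain, num_boost_round=100)
```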
Light Gradient Boosting Machine:
LightGBM is a fast, distributed, high-performance gradient boosting framework based on the decision tree algorithm. It can be used for classification, regression, and many other machine learning tasks. The algorithm grows trees leaf-wise, always splitting the leaf with the maximum delta loss. LightGBM uses histogram-based algorithms, whose advantages are as follows:
1) Reduced memory usage: compact bin indices replace continuous feature values.
2) Faster training: split finding scans a fixed number of bins instead of every distinct feature value.
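The histogram idea can be sketched as follows: continuous feature values are bucketed into a fixed number of bins, per-bin gradient statistics are accumulated in one pass, and candidate splits are evaluated only at bin boundaries. This is a minimal illustrative sketch of the technique, not LightGBM's actual implementation:

```python
import numpy as np

def best_split_histogram(x, grad, n_bins=16):
    """Find the split threshold with the best gradient-based gain,
    scanning histogram bins instead of every distinct feature value."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(x, edges[1:-1]), 0, n_bins - 1)
    # Per-bin gradient sums and counts: built in one pass over the data.
    g_sum = np.bincount(bin_idx, weights=grad, minlength=n_bins)
    cnt = np.bincount(bin_idx, minlength=n_bins)
    total_g, total_c = g_sum.sum(), cnt.sum()
    best_gain, best_edge = -np.inf, None
    gl = cl = 0.0
    for b in range(n_bins - 1):   # candidate splits = bin boundaries only
        gl += g_sum[b]
        cl += cnt[b]
        cr = total_c - cl
        if cl == 0 or cr == 0:
            continue
        gr = total_g - gl
        # Standard gradient-boosting split score (squared gradient sums).
        gain = gl**2 / cl + gr**2 / cr - total_g**2 / total_c
        if gain > best_gain:
            best_gain, best_edge = gain, edges[b + 1]
    return best_edge, best_gain

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 1000)
grad = np.where(x > 5, 1.0, -1.0)   # gradient sign flips at x = 5
thr, gain = best_split_histogram(x, grad)
```

Because only `n_bins - 1` boundaries are scanned rather than up to a thousand distinct values, the cost of split finding no longer grows with the number of unique feature values — the source of the speed and memory advantages listed above.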
LightGBM trains much faster, but its leaf-wise growth can sometimes lead to overfitting. So, let us see which parameters can be tuned to obtain a better model.
To get the best fit, the following parameters must be tuned:
1) num_leaves: the main parameter controlling tree complexity; keep it well below 2^max_depth to avoid overfitting.
2) min_data_in_leaf: the minimum number of samples in a leaf; larger values prevent overly specific leaves.
3) max_depth: limits tree depth explicitly, even with leaf-wise growth.
For achieving better accuracy, the following parameters must be tuned:
1) learning_rate: use a smaller learning rate together with a larger number of iterations.
2) max_bin: more histogram bins give finer splits at the cost of training speed.
3) num_leaves: larger values increase model capacity (and the risk of overfitting).
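As a closing illustration, the fit- and accuracy-oriented knobs above might be combined into a single configuration like the following. This is a sketch assuming the standard lightgbm Python API; the parameter names (num_leaves, min_data_in_leaf, learning_rate, max_bin) are real LightGBM parameters, but the values are illustrative, not tuned:

```python
# Illustrative LightGBM parameters (example values, not recommendations).
params = {
    # Controlling overfitting (better fit)
    "num_leaves": 31,          # main complexity control for leaf-wise trees
    "min_data_in_leaf": 50,    # larger values reduce overfitting
    "max_depth": 8,            # caps depth even with leaf-wise growth
    # Pushing accuracy
    "learning_rate": 0.05,     # smaller rate + more iterations -> better accuracy
    "num_iterations": 500,     # number of boosting rounds
    "max_bin": 255,            # more histogram bins -> finer splits, slower training
    "objective": "binary",     # binary classification task
}
# Typically passed as: lightgbm.train(params, train_set)
```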