Machine Learning: XGBoost — An intuition

Rahul S
2 min read · Sep 4, 2023

XGBoost, or Extreme Gradient Boosting, is an ensemble learning algorithm primarily based on gradient boosting and optimization principles. It builds a strong predictive model by combining the predictions of multiple weak learners, often decision trees, through an iterative process.
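
As a quick illustration, here is a minimal sketch using the library's scikit-learn-style API. The dataset is synthetic and the hyperparameters are illustrative choices, not recommendations:

```python
# Minimal XGBoost sketch: synthetic regression data, illustrative hyperparameters.
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))
y = X[:, 0] ** 2 + 3 * X[:, 1] + rng.normal(scale=0.1, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# n_estimators = number of boosted trees; learning_rate shrinks each tree's contribution.
model = XGBRegressor(n_estimators=200, learning_rate=0.1, max_depth=3)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```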

1. HOW IT WORKS

Here’s a concise technical breakdown of how XGBoost works:

  1. Gradient Boosting: XGBoost follows a boosting approach where each new model corrects the errors of the previous ones, leading to incremental performance improvement (the from-scratch sketch after this list walks through one such loop).
  2. Loss Function: It minimizes a loss function that quantifies the gap between predicted and actual values; common choices are mean squared error (for regression) and log loss (for classification).
  3. Gradient Descent: XGBoost employs gradient descent to minimize the loss function, computing the gradient of the loss with respect to the current model's predictions and moving the predictions in the negative-gradient direction.
  4. Additive Learning: At each boosting iteration, a new decision tree (weak learner) is added to the ensemble. This tree aims to minimize the residual errors left by the previous trees.
  5. Weighted Updates: XGBoost's second-order approximation attaches a gradient and a hessian to every data point; points that are harder to predict (higher residual errors) contribute larger gradients, so subsequent trees focus on correcting them, while the hessians act as per-point weights when leaf values are computed (see the second sketch below)…
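
To make steps 1 through 4 concrete, here is a from-scratch sketch of gradient boosting with squared-error loss. For the loss 0.5 * (y - F(x))^2, the negative gradient with respect to the prediction F(x) is exactly the residual y - F(x), so "fit a tree to the gradient" reduces to "fit a tree to the residual". Shallow scikit-learn trees stand in for XGBoost's weak learners, and XGBoost's regularization and second-order terms are omitted:

```python
# From-scratch gradient boosting sketch with squared-error loss.
# Negative gradient of 0.5 * (y - F(x))**2 w.r.t. F(x) is the residual (y - F(x)),
# so each round fits a small tree to what the ensemble still gets wrong.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=50, learning_rate=0.1, max_depth=2):
    pred = np.full(len(y), y.mean())             # start from a constant model
    trees = []
    for _ in range(n_rounds):
        residual = y - pred                      # negative gradient (step 3)
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        pred += learning_rate * tree.predict(X)  # additive update (step 4)
        trees.append(tree)
    return trees, pred

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)
trees, fitted = gradient_boost(X, y)
print("training MSE:", np.mean((y - fitted) ** 2))
```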
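
On step 5: unlike AdaBoost, XGBoost does not reweight samples explicitly. Its second-order Taylor expansion of the loss gives each point a gradient g and a hessian h, and the optimal value of a leaf is -sum(g) / (sum(h) + lambda), so the hessians act as per-point weights. A small sketch for logistic loss, assuming raw log-odds scores as the current predictions (lambda is the L2 penalty from XGBoost's objective):

```python
# Gradient and hessian of log loss per data point, as used in XGBoost's
# second-order update; the hessians weight each point's contribution.
import numpy as np

def grad_hess_logloss(y, raw_score):
    p = 1.0 / (1.0 + np.exp(-raw_score))  # predicted probability from log-odds
    g = p - y                             # first derivative of log loss
    h = p * (1.0 - p)                     # second derivative, always >= 0
    return g, h

y = np.array([1.0, 0.0, 1.0, 1.0])
raw = np.array([0.2, -0.1, 0.0, 1.5])
g, h = grad_hess_logloss(y, raw)

lam = 1.0  # L2 regularization on leaf values
leaf_value = -g.sum() / (h.sum() + lam)  # optimal leaf weight: -G / (H + lambda)
print("optimal leaf value:", leaf_value)
```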
