Machine Learning: AdaBoost

Rahul S
2 min readSep 5, 2023

AdaBoost (Adaptive Boosting) is a binary classification ensemble learning algorithm that focuses on data points misclassified by the current ensemble.

It iteratively trains weak classifiers, assigns weights to data points, and combines them into a strong classifier. Its key steps:

  1. Initialization: Set equal weights for all training data points.
  2. Iterative Training: For each iteration, train a weak classifier, often a decision stump, on weighted data. It emphasizes misclassified data from the previous round.
  3. Classifier Weight Calculation: Calculate the weighted error of the weak classifier and compute its weight in the ensemble. Classifiers with lower weighted error receive higher weights.
  4. Weight Update: Adjust data point weights based on the correctness of classification by the current weak classifier. Increase weights for misclassified points, decrease for correct ones.
  5. Ensemble Creation: Combine weak classifiers into an ensemble, each with its weight.
  6. Normalization of Weights: Ensure data point weights form a probability distribution.
  7. Final Classification: Compute the final prediction as the sign of the weighted sum of predictions from weak classifiers.
  8. Repeat Iterations: Continue until a preset number of weak classifiers has been trained or the training error stops improving.
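The steps above can be sketched from scratch with decision stumps as weak classifiers. This is a minimal illustration, not a production implementation; the function names and the toy dataset are made up for the example, and the classifier weight uses the standard AdaBoost formula alpha = 0.5 * ln((1 - err) / err).

```python
import numpy as np

def adaboost_train(X, y, n_rounds=10):
    """Train AdaBoost with decision stumps. Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)           # step 1: equal weights for all points
    stumps = []                        # (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        # step 2: pick the stump that minimises the weighted error
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] < thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # guard against log/division blow-up
        # step 3: classifier weight (higher for lower weighted error)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = pol * np.where(X[:, j] < thr, 1, -1)
        # steps 4 & 6: raise weights of misclassified points, lower the
        # rest, then renormalise so the weights sum to 1
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((j, thr, pol, alpha))   # step 5: grow the ensemble
    return stumps

def adaboost_predict(stumps, X):
    # step 7: sign of the alpha-weighted sum of stump predictions
    score = sum(alpha * pol * np.where(X[:, j] < thr, 1, -1)
                for j, thr, pol, alpha in stumps)
    return np.sign(score)

# Toy usage on a linearly separable 1-D dataset (illustrative only)
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([1, 1, 1, -1, -1, -1])
stumps = adaboost_train(X, y, n_rounds=5)
```

The stump search is brute force (every feature, threshold, and polarity), which is fine for a sketch but is what libraries replace with efficient tree learners in practice.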
