What is the difference between bagging and boosting in ensemble learning?
January 9, 2025
Bagging (bootstrap aggregating) trains multiple models independently on bootstrap samples of the training data and combines their predictions by voting or averaging, which reduces variance and overfitting (e.g., Random Forest).
Boosting trains models sequentially, with each new model correcting the errors made by the models before it, which mainly reduces bias (e.g., AdaBoost, Gradient Boosting).
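
As a minimal sketch of the contrast, the example below trains a bagging ensemble and a boosting ensemble side by side using scikit-learn; the library, synthetic dataset, and hyperparameters are illustrative assumptions rather than part of the answer above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy dataset purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging: 100 trees fit independently on bootstrap samples;
# their predictions are combined by majority vote (reduces variance).
bagging = BaggingClassifier(n_estimators=100, random_state=0)

# Boosting: 100 weak learners fit sequentially, each giving more weight
# to the examples the previous ones misclassified (reduces bias).
boosting = AdaBoostClassifier(n_estimators=100, random_state=0)

for name, model in [("Bagging", bagging), ("AdaBoost", boosting)]:
    model.fit(X_train, y_train)
    print(f"{name} test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```

Both ensembles here use decision trees as base learners (scikit-learn's defaults), which makes the difference in training strategy, rather than the base model, the thing being compared.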