What is the difference between bagging and boosting?
Bagging (Bootstrap Aggregating): Trains multiple independent models (e.g., decision trees) in parallel, each on a bootstrap sample of the training data (a random subset drawn with replacement), and combines their predictions by voting or averaging. This mainly reduces variance and improves accuracy. Random Forest is an example of a bagging algorithm.
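The bagging recipe above can be sketched in a few lines of pure Python. This is a minimal toy illustration, not a production implementation: the dataset, the threshold-stump weak learner, and the function names are all invented for this example. Each model is trained on its own bootstrap sample, and predictions are combined by majority vote.

```python
import random
from collections import Counter

def train_stump(sample):
    # Toy weak learner: a one-feature threshold classifier chosen to
    # minimize training error on this bootstrap sample.
    best = None
    for thresh, _ in sample:
        for sign in (1, -1):
            err = sum(1 for x, y in sample
                      if (1 if sign * (x - thresh) >= 0 else -1) != y)
            if best is None or err < best[0]:
                best = (err, thresh, sign)
    _, thresh, sign = best
    return lambda x: 1 if sign * (x - thresh) >= 0 else -1

def bagging_predict(models, x):
    # Aggregate by majority vote across the independently trained models
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

rng = random.Random(0)
data = [(0.1, -1), (0.4, -1), (0.6, 1), (0.9, 1)]  # toy 1-D dataset
# Each model sees a bootstrap sample: n points drawn with replacement
models = [train_stump([rng.choice(data) for _ in range(len(data))])
          for _ in range(25)]
print(bagging_predict(models, 0.2), bagging_predict(models, 0.8))
```

Because every model trains on an independent resample, the models could be trained in parallel, which is one practical advantage bagging has over boosting.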
Boosting: Trains models sequentially, where each new model focuses on correcting the errors of its predecessors. Boosting combines many weak learners into a single strong learner, primarily reducing bias, with AdaBoost and Gradient Boosting being popular examples.
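The sequential error-correction idea can be seen concretely in an AdaBoost-style sketch, again pure Python on an invented toy dataset. After each round, examples the current weak learner misclassified get larger weights, so the next learner concentrates on them; the final prediction is a confidence-weighted vote.

```python
import math

def stump_predict(thresh, sign, x):
    # Weak learner: predict +1 on one side of a threshold, -1 on the other
    return 1 if sign * (x - thresh) >= 0 else -1

def train_adaboost(data, n_rounds):
    # data: list of (x, y) pairs with labels y in {-1, +1}
    n = len(data)
    weights = [1.0 / n] * n          # start with uniform example weights
    ensemble = []                    # list of (alpha, thresh, sign) triples
    for _ in range(n_rounds):
        # Pick the stump minimizing *weighted* training error
        best = None
        for thresh, _ in data:
            for sign in (1, -1):
                err = sum(w for (x, y), w in zip(data, weights)
                          if stump_predict(thresh, sign, x) != y)
                if best is None or err < best[0]:
                    best = (err, thresh, sign)
        err, thresh, sign = best
        err = max(err, 1e-10)        # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)  # model's vote weight
        ensemble.append((alpha, thresh, sign))
        # Upweight the examples this stump got wrong, downweight the rest
        weights = [w * math.exp(-alpha * y * stump_predict(thresh, sign, x))
                   for (x, y), w in zip(data, weights)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def adaboost_predict(ensemble, x):
    # Weighted vote: models with lower error get a larger alpha
    score = sum(a * stump_predict(t, s, x) for a, t, s in ensemble)
    return 1 if score >= 0 else -1

data = [(0.1, -1), (0.3, -1), (0.6, 1), (0.9, 1)]  # toy 1-D dataset
ensemble = train_adaboost(data, 5)
print([adaboost_predict(ensemble, x) for x, _ in data])  # → [-1, -1, 1, 1]
```

Note the contrast with bagging: each round here depends on the weights produced by the previous round, so training is inherently sequential.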