Answer Posted / Pankaj Balodi
Bagging (Bootstrap Aggregating) and Boosting are two popular ensemble learning techniques. Bagging trains multiple models in parallel on different subsets of the training data created by random sampling with replacement (bootstrap samples), then combines their predictions by voting or averaging; this reduces correlation among the individual models and lowers variance. Boosting trains multiple weak learners sequentially, where each new model is trained to correct the errors made by the previous models, which mainly reduces bias.
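The contrast can be sketched in a few lines. This is a minimal illustration using scikit-learn, which is an assumption on my part (the answer names no library): `BaggingClassifier` draws bootstrap samples for independent trees, while `AdaBoostClassifier` fits weak learners sequentially, reweighting the examples earlier learners got wrong.

```python
# Minimal sketch of bagging vs. boosting; scikit-learn is assumed here.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: each tree is trained on a bootstrap sample
# (random draw with replacement) of the training set.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                        random_state=0)
bag.fit(X_tr, y_tr)

# Boosting: weak learners (stumps by default) are trained one after
# another, each concentrating on the previous models' mistakes.
boost = AdaBoostClassifier(n_estimators=50, random_state=0)
boost.fit(X_tr, y_tr)

print(f"bagging accuracy:  {bag.score(X_te, y_te):.3f}")
print(f"boosting accuracy: {boost.score(X_te, y_te):.3f}")
```

Note that bagging trains its models independently (so it parallelizes easily), while boosting cannot start a new learner until the previous one has finished.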