Principle: Scikit-learn-contrib Imbalanced-learn Random Under-Sampling Boosting (RUSBoost)
| Knowledge Sources | |
|---|---|
| Domains | Machine_Learning, Ensemble_Learning, Imbalanced_Learning |
| Last Updated | 2026-02-09 03:00 GMT |
Overview
A boosting algorithm that integrates random under-sampling into each iteration of AdaBoost to handle class imbalance during the boosting process.
Description
RUSBoost modifies the AdaBoost algorithm by applying random under-sampling at each boosting iteration before training the weak learner. This combines the benefits of data-level resampling with the algorithmic advantages of boosting, producing an ensemble that counters majority-class bias at the data level while retaining boosting's focus on difficult examples.
Usage
Use this principle when a boosting ensemble is preferred and the class distribution is too skewed for standard AdaBoost, whose class-agnostic weighting lets the majority class dominate the weak learners.
Theoretical Basis
At each boosting iteration t:
- Apply random under-sampling to balance the weighted training data
- Train the weak learner on the balanced subset
- Update sample weights on the full training set based on misclassification, as in AdaBoost
- Aggregate predictions via weighted majority voting
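The iteration above can be sketched as a simplified loop for binary labels in {0, 1}. This is an illustrative reconstruction, not imbalanced-learn's implementation: the function names, the decision-stump weak learner, and the round count are assumptions, and edge cases (e.g. multi-class, early stopping policy) are handled only crudely:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def rusboost_sketch(X, y, n_rounds=10, seed=0):
    """Simplified RUSBoost loop for binary y in {0, 1}; illustrative only."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)               # boosting weights D_t over all samples
    minority = np.bincount(y).min()       # size of the smaller class
    learners, alphas = [], []
    for _ in range(n_rounds):
        # 1) Random under-sampling: draw `minority` samples from each class,
        #    which keeps the whole minority class and subsamples the majority.
        idx = np.concatenate([
            rng.choice(np.flatnonzero(y == c), size=minority, replace=False)
            for c in np.unique(y)])
        # 2) Train the weak learner on the balanced subset, passing through
        #    the current boosting weights restricted to that subset.
        stump = DecisionTreeClassifier(max_depth=1, random_state=0)
        stump.fit(X[idx], y[idx], sample_weight=w[idx])
        # 3) AdaBoost-style weight update computed on the FULL weighted set.
        pred = stump.predict(X)
        err = w[pred != y].sum()
        if err <= 0 or err >= 0.5:        # crude stopping rule for the sketch
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(alpha * np.where(pred == y, -1.0, 1.0))
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def rusboost_predict(learners, alphas, X):
    # 4) Weighted vote: sign of the alpha-weighted margin in {-1, +1} space.
    margin = sum(a * (2 * m.predict(X) - 1) for m, a in zip(learners, alphas))
    return (margin > 0).astype(int)
```

Note that the under-sampling in step 1 only changes which examples each weak learner sees; the boosting weights themselves are maintained and updated over the full training set, so hard examples from any class keep accumulating influence across rounds.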