
Principle:Scikit learn contrib Imbalanced learn Random Under Sampling Boosting

From Leeroopedia


Knowledge Sources
Domains Machine_Learning, Ensemble_Learning, Imbalanced_Learning
Last Updated 2026-02-09 03:00 GMT

Overview

RUSBoost is a boosting algorithm that integrates random under-sampling into each iteration of AdaBoost, handling class imbalance inside the boosting process itself.

Description

RUSBoost modifies the AdaBoost algorithm by applying random under-sampling at each boosting iteration before training the weak learner. This combines the benefits of data-level resampling with the algorithmic advantages of boosting: each weak learner sees balanced training data, while the boosting weights keep the ensemble focused on difficult examples.

Usage

Use this principle when boosting is preferred and the dataset is too imbalanced for standard AdaBoost to perform well.

Theoretical Basis

At each boosting iteration t:

  1. Apply random under-sampling to the weighted training data until the classes are balanced
  2. Train the weak learner on the balanced subset
  3. Update the weights of all training samples based on the weak learner's misclassifications, as in AdaBoost
  4. At prediction time, aggregate the weak learners via weighted voting
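The four steps above can be sketched as follows. This is a simplified illustration, not imbalanced-learn's implementation: it assumes binary labels in {0, 1}, uses decision stumps as weak learners, draws majority samples in proportion to their current boosting weights, and applies the discrete AdaBoost weight update. All function names are hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def rusboost_fit(X, y, n_rounds=20, seed=0):
    """Sketch of RUSBoost: under-sample, fit weak learner, reweight."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform initial weights
    minority = int(np.bincount(y).argmin())
    learners, alphas = [], []
    for _ in range(n_rounds):
        # 1. Random under-sampling: keep all minority samples, draw an
        #    equal number of majority samples (weight-proportional,
        #    without replacement) so the subset is balanced.
        min_idx = np.flatnonzero(y == minority)
        maj_idx = np.flatnonzero(y != minority)
        p = w[maj_idx] / w[maj_idx].sum()
        keep = rng.choice(maj_idx, size=len(min_idx), replace=False, p=p)
        idx = np.concatenate([min_idx, keep])
        # 2. Train the weak learner on the balanced subset.
        stump = DecisionTreeClassifier(max_depth=1, random_state=0)
        stump.fit(X[idx], y[idx], sample_weight=w[idx])
        # 3. AdaBoost weight update, computed on the FULL weighted set.
        miss = stump.predict(X) != y
        err = np.clip(w[miss].sum() / w.sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        w *= np.exp(np.where(miss, alpha, -alpha))
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def rusboost_predict(learners, alphas, X):
    # 4. Weighted vote: map {0, 1} labels to {-1, +1}, take the sign.
    score = sum(a * (2 * l.predict(X) - 1) for l, a in zip(learners, alphas))
    return (score > 0).astype(int)

# Demo on a synthetic imbalanced problem.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
learners, alphas = rusboost_fit(X, y)
preds = rusboost_predict(learners, alphas, X)
```

Note the asymmetry in step 3: the weak learner is trained on the balanced subset, but the weight update uses its errors on the entire training set, so misclassified majority samples become more likely to survive future under-sampling rounds.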

Related Pages

Implemented By
