
Principle:Scikit learn contrib Imbalanced learn Balanced Bagging

From Leeroopedia


Knowledge Sources
Domains Machine_Learning, Ensemble_Learning, Imbalanced_Learning
Last Updated 2026-02-09 03:00 GMT

Overview

A bagging ensemble method that resamples each bootstrap to balance class distributions before training each base estimator.

Description

Balanced Bagging extends standard bagging by resampling each bootstrap sample before it is used to train a base estimator. By default the resampling is done with RandomUnderSampler, but any compatible sampler can be plugged in; swapping the sampler yields variants such as Exactly Balanced Bagging, Roughly Balanced Bagging, Over-Bagging, and SMOTE-Bagging.

Usage

Apply this principle when bagging with any base estimator (not only trees) and you need per-bag class balancing with a configurable sampler.

Theoretical Basis

For each base estimator in the ensemble:

  1. Draw a bootstrap sample from the training data
  2. Apply a sampler (default: under-sampling) to balance the bootstrap
  3. Train the base estimator on the balanced bootstrap

Related Pages

Implemented By
