
Principle:Scikit learn contrib Imbalanced learn Sensitivity Specificity Analysis

From Leeroopedia


Knowledge Sources
Domains Machine_Learning, Model_Evaluation, Imbalanced_Learning
Last Updated 2026-02-09 03:00 GMT

Overview

A pair of complementary evaluation metrics that separately measure a classifier's ability to correctly identify positive cases (sensitivity) and negative cases (specificity), providing a complete picture of classification behavior on imbalanced data.

Description

Sensitivity (also called recall or true positive rate) measures the proportion of actual positives correctly identified: TP / (TP + FN). Specificity (true negative rate) measures the proportion of actual negatives correctly identified: TN / (TN + FP). Together, they reveal whether a classifier is biased toward one class.

Standard accuracy conflates these two aspects: on imbalanced data, a classifier that always predicts the majority class can score highly while missing every minority-class case. Reporting sensitivity and specificity separately is therefore essential in imbalanced classification, where the costs of false negatives and false positives often differ significantly.
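A minimal sketch of this point in plain Python, using hypothetical confusion-matrix counts for an imbalanced sample (95 negatives, 5 positives): accuracy looks strong even though most positives are missed.

```python
# Hypothetical counts for an imbalanced problem:
# 95 actual negatives, 5 actual positives.
tp, fn = 2, 3   # positives: 2 caught, 3 missed
tn, fp = 93, 2  # negatives: 93 correct, 2 false alarms

sensitivity = tp / (tp + fn)                 # TP / (TP + FN)
specificity = tn / (tn + fp)                 # TN / (TN + FP)
accuracy = (tp + tn) / (tp + tn + fp + fn)   # overall fraction correct

print(f"sensitivity={sensitivity:.2f}")  # 0.40 - most positives are missed
print(f"specificity={specificity:.2f}")  # 0.98
print(f"accuracy={accuracy:.2f}")        # 0.95 - misleadingly high
```

Accuracy alone (0.95) hides the fact that the classifier recovers only 40% of the positive class.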

Usage

Use these metrics when you need to understand both positive and negative class recognition separately, especially in medical, fraud detection, or other domains where false negative/positive costs are asymmetric.

Theoretical Basis

Sensitivity = TP / (TP + FN)

Specificity = TN / (TN + FP)

The sensitivity_specificity_support function (exposed in imblearn.metrics) computes both values simultaneously, along with per-class support counts.
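A rough sketch of those per-class semantics in plain Python; the function name, signature, and return shape here are illustrative only, not the library's exact API.

```python
def sens_spec_support(y_true, y_pred, labels):
    """Per-class sensitivity, specificity, and support (illustrative sketch)."""
    sens, spec, support = [], [], []
    for c in labels:
        # Treat class c as "positive" and all other classes as "negative".
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        tn = sum(t != c and p != c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        sens.append(tp / (tp + fn) if tp + fn else 0.0)
        spec.append(tn / (tn + fp) if tn + fp else 0.0)
        support.append(tp + fn)  # number of true instances of class c
    return sens, spec, support

y_true = [0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 1, 1, 0]
print(sens_spec_support(y_true, y_pred, labels=[0, 1]))
# → ([0.75, 0.5], [0.5, 0.75], [4, 2])
```

Note that for a binary problem, the sensitivity of one class equals the specificity of the other, which is why the pair is reported per class.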

Related Pages

Implemented By
