Implementation: DistrictDataLabs Yellowbrick PrecisionRecallCurve Visualizer
| Knowledge Sources | |
|---|---|
| Domains | Machine_Learning, Classification, Visualization |
| Last Updated | 2026-02-08 00:00 GMT |
Overview
Concrete tool for visualizing the precision-recall tradeoff across decision thresholds for binary and multiclass classifiers, provided by the Yellowbrick library.
Description
The PrecisionRecallCurve class (also aliased as PRCurve) is a classification score visualizer that computes and plots precision-recall curves. For binary classification, it produces a single curve with optional area fill and average precision annotation. For multiclass problems, it wraps the estimator in a OneVsRestClassifier and can display per-class curves, a micro-averaged curve, or both. The visualizer supports ISO F1 curve overlays, configurable fill and line opacities, and custom color schemes via explicit color lists or colormaps.
The companion quick method precision_recall_curve() provides a one-call interface that instantiates, fits, scores, and renders the visualizer.
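The multiclass behavior described above can be illustrated with a scikit-learn-only sketch that mirrors what the visualizer does internally: wrap the estimator one-vs-rest, binarize the targets, and compute per-class and micro-averaged curves. This is an approximation for illustration; the actual Yellowbrick internals may differ in detail.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, precision_recall_curve
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import label_binarize

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Wrap the estimator one-vs-rest, as the visualizer does for multiclass targets.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X_train, y_train)
y_bin = label_binarize(y_test, classes=[0, 1, 2])  # one column per class
y_scores = clf.predict_proba(X_test)

# Per-class average precision (roughly the per_class=True view).
per_class_ap = {
    c: average_precision_score(y_bin[:, c], y_scores[:, c]) for c in range(3)
}

# Micro-average: pool every (sample, class) decision into one curve (micro=True).
micro_p, micro_r, _ = precision_recall_curve(y_bin.ravel(), y_scores.ravel())
micro_ap = average_precision_score(y_bin, y_scores, average="micro")
```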
Usage
Use PrecisionRecallCurve when evaluating classifiers on imbalanced datasets, or when the costs of false positives and false negatives are asymmetric. Reach for it when you need to assess precision-recall tradeoffs visually, compare models, or present results annotated with average precision scores.
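To see why this matters on imbalanced data, note that the average precision of a random scorer equals the positive-class prevalence, so a PR curve makes a weak model on rare positives obvious. A small sketch with a synthetic 5%-positive dataset (the dataset and model choices here are illustrative, not from the Yellowbrick docs):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score
from sklearn.model_selection import train_test_split

# 5% positives: accuracy can look flattering here, while average
# precision is anchored to the positive-class prevalence.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
ap = average_precision_score(y_test, model.predict_proba(X_test)[:, 1])
baseline = y_test.mean()  # AP of a random scorer equals the prevalence
print(f"model AP={ap:.3f} vs chance baseline={baseline:.3f}")
```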
Code Reference
Source Location
- Repository: yellowbrick
- File: yellowbrick/classifier/prcurve.py
- Class Lines: L211-345 (PrecisionRecallCurve class)
- Quick Method Lines: L485-678 (precision_recall_curve function)
Signature
class PrecisionRecallCurve(ClassificationScoreVisualizer):
def __init__(
self,
estimator,
ax=None,
classes=None,
colors=None,
cmap=None,
encoder=None,
fill_area=True,
ap_score=True,
micro=True,
iso_f1_curves=False,
iso_f1_values=(0.2, 0.4, 0.6, 0.8),
per_class=False,
fill_opacity=0.2,
line_opacity=0.8,
is_fitted="auto",
force_model=False,
**kwargs
)
def fit(self, X, y=None)
def score(self, X, y)
def precision_recall_curve(
estimator,
X_train,
y_train,
X_test=None,
y_test=None,
ax=None,
classes=None,
colors=None,
cmap=None,
encoder=None,
fill_area=True,
ap_score=True,
micro=True,
iso_f1_curves=False,
iso_f1_values=(0.2, 0.4, 0.6, 0.8),
per_class=False,
fill_opacity=0.2,
line_opacity=0.8,
is_fitted="auto",
force_model=False,
show=True,
**kwargs
)
Import
from yellowbrick.classifier import PrecisionRecallCurve
from yellowbrick.classifier.prcurve import precision_recall_curve
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| estimator | sklearn classifier | Yes | A scikit-learn classifier with predict_proba or decision_function method |
| ax | matplotlib Axes | No | Axes object on which to draw the curve; uses current axes if not provided |
| classes | list of str | No | Human-readable class labels for the legend |
| colors | list of str | No | Explicit list of colors for per-class curves; overrides cmap if both provided |
| cmap | str or Matplotlib colormap | No | Colormap for per-class curve coloring; ignored if colors is provided |
| encoder | dict or LabelEncoder | No | Mapping from target values to human-readable labels |
| fill_area | bool | No | If True (default), fill the area under the curve with the curve color |
| ap_score | bool | No | If True (default), annotate the plot with the average precision score |
| micro | bool | No | If True (default), plot the micro-averaged PR curve for multiclass; ignored for binary |
| iso_f1_curves | bool | No | If True, overlay ISO F1 curves on the plot; defaults to False |
| iso_f1_values | tuple of float | No | F1 values at which to draw ISO F1 curves; defaults to (0.2, 0.4, 0.6, 0.8) |
| per_class | bool | No | If True, plot per-class PR curves using OneVsRest; defaults to False |
| fill_opacity | float | No | Alpha of the area fill; defaults to 0.2 |
| line_opacity | float | No | Alpha of the curve lines; defaults to 0.8 |
| is_fitted | bool or str | No | Whether the estimator is already fitted; defaults to "auto" |
| force_model | bool | No | If True, skip the classifier type check on the estimator |
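The iso_f1_curves overlays can be understood from the definition of F1. Fixing F1 = f in f = 2pr/(p + r) and solving for precision gives p = f·r / (2r − f); precision stays in [0, 1] for recall between f/(2 − f) and 1. A sketch of how such contours can be computed (not Yellowbrick's own plotting code):

```python
import numpy as np

def iso_f1_precision(recall, f1):
    """Precision p such that F1(p, recall) == f1, from F1 = 2pr/(p+r)."""
    return f1 * recall / (2 * recall - f1)

for f1 in (0.2, 0.4, 0.6, 0.8):  # the default iso_f1_values
    # Restrict recall so the resulting precision stays within [0, 1].
    r = np.linspace(f1 / (2 - f1), 1.0, 50)
    p = iso_f1_precision(r, f1)
    # Sanity check: every (p, r) point on the contour has F1 == f1.
    assert np.allclose(2 * p * r / (p + r), f1)
    assert p.min() >= 0.0 and p.max() <= 1.0 + 1e-12
```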
Outputs
| Name | Type | Description |
|---|---|---|
| score_ | float or dict | Average precision score; float for binary, dict keyed by class for multiclass |
| precision_ | ndarray or dict | Precision values at each threshold; array for binary, dict for multiclass |
| recall_ | ndarray or dict | Recall values at each threshold; array for binary, dict for multiclass |
| target_type_ | str | Detected target type, either "binary" or "multiclass" |
| ax | matplotlib Axes | The axes with the rendered precision-recall curve(s) |
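In the binary case these attributes follow the conventions of sklearn.metrics.precision_recall_curve and average_precision_score, which the visualizer builds on. A minimal scikit-learn-only sketch of the contract (the dataset and model here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, precision_recall_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

# Counterparts of the binary precision_, recall_, and score_ attributes.
precision, recall, thresholds = precision_recall_curve(y_test, scores)
ap = average_precision_score(y_test, scores)

# sklearn conventions the visualizer inherits: one more point than
# thresholds, and a final (precision=1, recall=0) endpoint.
assert len(precision) == len(recall) == len(thresholds) + 1
assert precision[-1] == 1.0 and recall[-1] == 0.0
```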
Usage Examples
Basic Usage
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from yellowbrick.classifier import PrecisionRecallCurve
from yellowbrick.datasets import load_spam
X, y = load_spam()
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
viz = PrecisionRecallCurve(LogisticRegression(), iso_f1_curves=True)
viz.fit(X_train, y_train)
viz.score(X_test, y_test)
viz.show()
Quick Method
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from yellowbrick.classifier.prcurve import precision_recall_curve
from yellowbrick.datasets import load_spam
X, y = load_spam()
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
precision_recall_curve(LogisticRegression(), X_train, y_train, X_test, y_test)