
Implementation:DistrictDataLabs Yellowbrick AlphaSelection Visualizer

From Leeroopedia


Knowledge Sources
Domains Machine_Learning, Regression, Visualization
Last Updated 2026-02-08 00:00 GMT

Overview

A concrete tool, provided by the Yellowbrick library, for visualizing the relationship between the regularization parameter alpha and model error during hyperparameter tuning.

Description

The AlphaSelection visualizer demonstrates how different values of the regularization hyperparameter alpha influence model selection in penalized linear models. It wraps a Scikit-Learn "RegressorCV" estimator (such as RidgeCV, LassoCV, LassoLarsCV, or ElasticNetCV) and plots the alpha-error curve after the cross-validated fitting process completes.

The visualizer extracts the array of alpha values and corresponding cross-validated errors directly from the fitted estimator's attributes. It then plots a line chart of error against alpha, with a vertical dashed line marking the optimal alpha value selected by the estimator. This allows the practitioner to visually verify that the model is responding meaningfully to regularization and that the chosen alpha sits at a sensible location on the error curve.
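The same attributes the visualizer reads can be inspected directly on a fitted estimator. The sketch below (using a synthetic dataset for illustration; the dataset and variable names are not from the Yellowbrick source) shows that the alpha minimizing the mean cross-validated error along `mse_path_` is exactly the `alpha_` the estimator selects:

```python
# Sketch of the data AlphaSelection reads from a fitted "RegressorCV"
# estimator: the alpha grid (alphas_) and the CV error path (mse_path_).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV

X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)
model = LassoCV(cv=5, random_state=0).fit(X, y)

# mse_path_ has shape (n_alphas, n_folds); average the error across folds
mean_errors = model.mse_path_.mean(axis=1)

# The alpha with the lowest mean CV error is the one the estimator selects
best = model.alphas_[np.argmin(mean_errors)]
assert np.isclose(best, model.alpha_)
```

Plotting `mean_errors` against `model.alphas_` reproduces the alpha-error curve; the vertical dashed line the visualizer draws sits at `model.alpha_`.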

The visualizer requires an estimator whose class name ends with "CV". For non-CV regressors, the companion ManualAlphaSelection visualizer is available, which iterates through alpha values manually using cross_val_score. For RidgeCV, the visualizer automatically sets store_cv_values=True to ensure the necessary data is available after fitting.
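For a non-CV regressor, the manual approach amounts to scoring each candidate alpha with cross_val_score. The following is a hand-rolled sketch of that idea, not the actual ManualAlphaSelection implementation; the alpha grid and dataset are illustrative:

```python
# Manually sweep an alpha grid with cross_val_score, mirroring the idea
# behind ManualAlphaSelection (not its exact implementation).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

alphas = np.logspace(-3, 3, 13)
errors = [
    # Negate neg_mean_squared_error so lower values mean better fits
    -cross_val_score(Ridge(alpha=a), X, y,
                     scoring="neg_mean_squared_error", cv=5).mean()
    for a in alphas
]
best_alpha = alphas[int(np.argmin(errors))]
```

ManualAlphaSelection wraps this loop and draws the same alpha-error curve as AlphaSelection.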

Usage

Use AlphaSelection when you need to:

  • Visualize how regularization strength affects cross-validated error in Ridge, Lasso, or ElasticNet models
  • Verify that the optimal alpha chosen by a "RegressorCV" estimator is at a meaningful minimum
  • Confirm that the range of alpha values searched is appropriate (the optimal alpha should not be at the boundary)
  • Diagnose whether a given regularization type (L1, L2, or ElasticNet) is effective for the dataset

Code Reference

Source Location

  • Repository: yellowbrick
  • File: yellowbrick/regressor/alphas.py
  • Class: Lines 40-222
  • Quick Method: Lines 373-433

Signature

class AlphaSelection(RegressionScoreVisualizer):
    def __init__(self, estimator, ax=None, is_fitted="auto", **kwargs)

Import

from yellowbrick.regressor import AlphaSelection

I/O Contract

Inputs

  • estimator (Scikit-Learn "RegressorCV", required) — A cross-validated regularization estimator whose class name ends with "CV" (e.g. LassoCV, RidgeCV, ElasticNetCV). A YellowbrickTypeError is raised if the name does not end with "CV".
  • ax (matplotlib Axes, optional) — The axes to plot on. If None, the current axes are used or created.
  • is_fitted (bool or str, optional, default 'auto') — Whether the estimator is already fitted; 'auto' checks automatically.
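The "ends with CV" contract can be illustrated with a one-line check on the estimator's class name. This is an illustrative version of the check described above, not the code from yellowbrick/regressor/alphas.py:

```python
# Illustrative class-name check behind the YellowbrickTypeError contract
from sklearn.linear_model import Lasso, LassoCV

def is_cv_regressor(est):
    # AlphaSelection accepts estimators whose class name ends with "CV"
    return est.__class__.__name__.endswith("CV")

assert is_cv_regressor(LassoCV())       # accepted
assert not is_cv_regressor(Lasso())     # would raise YellowbrickTypeError
```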

Outputs

  • estimator.alpha_ (float) — The optimal alpha value selected by the cross-validated estimator.
  • ax (matplotlib Axes) — The axes containing the alpha-error line chart with a vertical dashed line at the optimal alpha.

Usage Examples

Basic Usage

from sklearn.linear_model import LassoCV
from yellowbrick.regressor import AlphaSelection
from yellowbrick.datasets import load_concrete

# Load dataset
X, y = load_concrete()

# Create and fit the visualizer
viz = AlphaSelection(LassoCV())
viz.fit(X, y)
viz.show()

Quick Method

from sklearn.linear_model import LassoCV
from yellowbrick.regressor import alphas
from yellowbrick.datasets import load_concrete

X, y = load_concrete()

viz = alphas(LassoCV(), X, y)

Related Pages

Implements Principle

Requires Environment
