# Implementation: InterpretML `show`
| Field | Value |
|---|---|
| Sources | InterpretML |
| Domains | Visualization, User_Interface |
| Last Updated | 2026-02-07 12:00 GMT |
## Overview

`show` is the InterpretML function for rendering interactive visualizations of model explanations.
## Description

The `show` function is the primary entry point for visualizing model explanations. It accepts one or more `Explanation` objects and a key selector, delegates to the auto-detected visualization provider, and renders interactive charts. With `key=-1` it shows the overall summary; with `key=N` it shows the Nth term or sample.
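The delegation pattern described above (accept one or more explanations plus a `key`, normalize to a list, and hand each item to a provider's `render()`) can be sketched in plain Python. This is an illustrative mock, not the library's actual implementation: `FakeProvider`, its string-returning `render()`, and the dict-shaped explanations are all invented for demonstration.

```python
class FakeProvider:
    """Stand-in provider (hypothetical): 'renders' by returning a string."""

    def render(self, explanation, key=-1, **kwargs):
        # key=-1 selects the overall summary; key=N selects the Nth term/sample.
        if key == -1:
            return f"summary of {explanation['name']}"
        return f"{explanation['name']} term {key}"


def show(explanation, key=-1, provider=None, **kwargs):
    """Mock of the dispatch pattern: accept a scalar or list of
    explanations, then delegate each one to the provider's render()."""
    provider = provider or FakeProvider()
    explanations = explanation if isinstance(explanation, list) else [explanation]
    return [provider.render(e, key=key, **kwargs) for e in explanations]


print(show({"name": "global"}))        # ['summary of global']
print(show({"name": "local"}, key=0))  # ['local term 0']
```

The real function returns `None` and renders as a side effect; the mock returns strings only so the delegation is observable.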
## Usage

Call `show()` whenever you want to visualize an explanation in a Jupyter notebook or browser.
## Code Reference

### Source Location

- Repository: interpretml/interpret
- File: python/interpret-core/interpret/visual/_interactive.py
- Lines: 136–160
### Signature

```python
def show(explanation, key=-1, **kwargs):
    """Provides an interactive visualization for a given explanation(s).

    Args:
        explanation: Either a scalar Explanation or list of Explanations.
        key: Specific index of explanation to visualize. -1 for overall.
        **kwargs: Kwargs passed to provider's render() call.
    """
```
### Import

```python
from interpret import show
```
## I/O Contract

### Inputs

| Name | Type | Required | Description |
|---|---|---|---|
| explanation | Explanation or list[Explanation] | Yes | One or more Explanation objects to visualize |
| key | int | No | Index of the explanation to visualize; -1 for the overall summary (default: -1) |
| **kwargs | keyword arguments | No | Keyword arguments passed to the provider's render() call |
### Outputs

| Name | Type | Description |
|---|---|---|
| None | None | Side effect: renders the visualization in a notebook or browser |
## Usage Examples

### Global and Local Visualization

```python
from interpret.glassbox import ExplainableBoostingClassifier
from interpret import show

ebm = ExplainableBoostingClassifier()
ebm.fit(X_train, y_train)

# Global explanation
global_exp = ebm.explain_global()
show(global_exp)          # Overall term importances
show(global_exp, key=0)   # Shape function of the first term

# Local explanation
local_exp = ebm.explain_local(X_test[:5], y_test[:5])
show(local_exp, key=0)    # Explanation for the first sample

# Compare multiple explanations side by side
show([global_exp, local_exp])
```