Implementation: Online ML River Metrics SMAPE
| Knowledge Sources | Details |
|---|---|
| Domains | Online_Learning, Evaluation_Metrics, Regression |
| Last Updated | 2026-02-08 16:00 GMT |
Overview
Symmetric Mean Absolute Percentage Error, a bounded, scale-independent prediction error that is symmetric in the true and predicted values.
Description
SMAPE computes 100 × mean(2 × |y_true - y_pred| / (|y_true| + |y_pred|)), a symmetric alternative to MAPE: the score is unchanged when the true and predicted values swap roles. By convention, a term counts as 0 when both y_true and y_pred are 0. SMAPE is bounded between 0% (perfect) and 200% (worst case). Note that the symmetry refers to interchanging y_true and y_pred; for a fixed true value, SMAPE in fact penalizes an under-prediction slightly more than an over-prediction of the same magnitude, because the denominator shrinks.
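The formula can be checked in a few lines of plain Python. This is a hand-rolled sketch of the definition above, not River's implementation:

```python
# Hand-computed SMAPE: 100 * mean(2*|y_t - y_p| / (|y_t| + |y_p|)),
# with the convention that a term is 0 when both values are 0.
def smape(y_true, y_pred):
    terms = []
    for yt, yp in zip(y_true, y_pred):
        denom = abs(yt) + abs(yp)
        terms.append(0.0 if denom == 0 else 2 * abs(yt - yp) / denom)
    return 100 * sum(terms) / len(terms)

# One all-zero pair (contributes 0), one over- and one under-prediction
print(smape([0, 100, 100], [0, 150, 50]))  # ≈ 35.56
```

The terms are 0, 2·50/250 = 0.4, and 2·50/150 ≈ 0.667, so the mean is ≈ 0.356, i.e. about 35.56%.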
Usage
Use SMAPE when you need a percentage-based error metric that is invariant to swapping the true and predicted values, or when MAPE's asymmetry under that swap is problematic. SMAPE is more robust than MAPE when true values are near zero, because its denominator also includes the prediction, though it remains ill-behaved when both values are exactly zero (River returns 0 for such terms). It is particularly useful in forecasting applications where a bounded, scale-independent error is desired.
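To make the near-zero behavior concrete, here is a small pure-Python comparison of the two formulas (an illustration, not River code):

```python
# When true values approach zero, MAPE explodes while SMAPE stays
# within its 0-200 bound, because SMAPE's denominator also includes
# the prediction.
y_true = [0.001, 0.002]
y_pred = [0.1, 0.1]

mape = 100 * sum(abs(t - p) / abs(t) for t, p in zip(y_true, y_pred)) / len(y_true)
smape = 100 * sum(2 * abs(t - p) / (abs(t) + abs(p)) for t, p in zip(y_true, y_pred)) / len(y_true)

print(f"MAPE:  {mape:.1f}%")   # thousands of percent
print(f"SMAPE: {smape:.1f}%")  # stays below the 200% bound
```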
Code Reference
Source Location
- Repository: Online_ml_River
- File: river/metrics/smape.py
Signature
class SMAPE(metrics.base.MeanMetric, metrics.base.RegressionMetric):
def __init__(self):
pass
Import
from river import metrics
I/O Contract
| Method | Parameters | Returns | Description |
|---|---|---|---|
| update | y_true (float), y_pred (float), [w] | None | Updates metric with true and predicted values |
| get | - | float | Returns symmetric MAPE (0 to 200, lower is better) |
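The update/get contract can be mimicked with a simple running mean. This is an illustrative sketch of how a MeanMetric-style SMAPE accumulates terms online; River's internals may differ, and the optional weight `w` here is an assumption:

```python
# Minimal running-mean sketch of the update/get contract.
# (Illustrative only; not River's actual implementation.)
class RunningSMAPE:
    def __init__(self):
        self.total = 0.0  # weighted sum of per-sample SMAPE terms
        self.n = 0.0      # total weight seen so far

    def update(self, y_true, y_pred, w=1.0):
        denom = abs(y_true) + abs(y_pred)
        term = 0.0 if denom == 0 else 2 * abs(y_true - y_pred) / denom
        self.total += w * term
        self.n += w

    def get(self):
        # Running value is available after any number of updates
        return 100 * self.total / self.n if self.n else 0.0

m = RunningSMAPE()
m.update(100, 150)
print(m.get())  # 40.0 after the first sample
m.update(150, 100)
print(m.get())  # still 40.0: symmetric in y_true and y_pred
```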
Usage Examples
from river import metrics
y_true = [0, 0.07533, 0.07533, 0.07533, 0.07533, 0.07533, 0.07533, 0.0672, 0.0672]
y_pred = [0, 0.102, 0.107, 0.047, 0.1, 0.032, 0.047, 0.108, 0.089]
metric = metrics.SMAPE()
for yt, yp in zip(y_true, y_pred):
metric.update(yt, yp)
print(metric)
# SMAPE: 37.869392
# Interpretation: Average symmetric percentage error is ~37.87%
# Compare SMAPE symmetry with MAPE: same absolute error of 50,
# with the roles of true and predicted value swapped
# MAPE (asymmetric under the swap)
mape_a = metrics.MAPE()
mape_b = metrics.MAPE()
mape_a.update(100, 150)  # true = 100, predicted = 150
mape_b.update(150, 100)  # true = 150, predicted = 100
print(f"MAPE (true=100, pred=150): {mape_a.get():.1f}%")  # 50.0%
print(f"MAPE (true=150, pred=100): {mape_b.get():.1f}%")  # 33.3%
# MAPE: different scores when the two values swap roles
# SMAPE (symmetric under the swap)
smape_a = metrics.SMAPE()
smape_b = metrics.SMAPE()
smape_a.update(100, 150)
smape_b.update(150, 100)
print(f"SMAPE (true=100, pred=150): {smape_a.get():.1f}%")  # 40.0%
print(f"SMAPE (true=150, pred=100): {smape_b.get():.1f}%")  # 40.0%
# SMAPE: identical score whichever value plays the role of the truth