
Principle:Sktime Pytorch forecasting Quantile Loss

From Leeroopedia


Knowledge Sources
Domains: Time_Series, Loss_Functions, Probabilistic_Forecasting
Last Updated: 2026-02-08 07:00 GMT

Overview

Loss function that trains models to predict multiple quantiles of the target distribution, enabling probabilistic forecasts without assuming a parametric distribution.

Description

Quantile Loss (also called pinball loss) is a non-parametric approach to probabilistic forecasting. Instead of learning distribution parameters (as with DeepAR's NormalDistributionLoss), quantile loss directly optimizes the model to predict specific quantiles (e.g., the 2nd, 10th, 25th, 50th, 75th, 90th, and 98th percentiles). Each quantile q has an asymmetric loss that penalizes under-prediction with weight q and over-prediction with weight 1 − q. The median quantile (q = 0.5) yields half the absolute error, so it corresponds to MAE-style point forecasting. This approach is distribution-free and naturally produces calibrated prediction intervals.
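The asymmetry described above can be illustrated with a minimal NumPy sketch of the single-quantile pinball loss (`pinball_loss` is a hypothetical helper name, not part of any library):

```python
import numpy as np

def pinball_loss(y, y_hat, q):
    """Pinball (quantile) loss for a single quantile q.

    Under-prediction (y > y_hat) is penalized with weight q;
    over-prediction (y < y_hat) is penalized with weight 1 - q.
    """
    diff = y - y_hat
    return np.maximum(q * diff, (q - 1) * diff)

# For a high quantile (q = 0.9), under-predicting by 2 costs
# 0.9 * 2 = 1.8, while over-predicting by 2 costs only 0.1 * 2 = 0.2,
# pushing the prediction toward the upper tail of the distribution.
print(pinball_loss(10.0, 8.0, 0.9))  # 1.8 (under-prediction)
print(pinball_loss(8.0, 10.0, 0.9))  # 0.2 (over-prediction)
```

At q = 0.5 both branches weigh the error equally, giving half the absolute error.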

Usage

Use QuantileLoss as the default loss for the Temporal Fusion Transformer and other multi-horizon models that output multiple quantiles per time step. It is appropriate when: (1) prediction intervals are needed, (2) no strong distributional assumption is warranted, and (3) the model architecture outputs a vector of quantile predictions per horizon.
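A minimal sketch of wiring QuantileLoss into a Temporal Fusion Transformer with pytorch-forecasting follows. It assumes `training_dataset` is an existing `TimeSeriesDataSet` (its construction is not shown here); all other names are the library's actual API:

```python
from pytorch_forecasting import TemporalFusionTransformer
from pytorch_forecasting.metrics import QuantileLoss

# QuantileLoss defaults to [0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98];
# passing quantiles explicitly makes the choice visible.
tft = TemporalFusionTransformer.from_dataset(
    training_dataset,  # assumed pre-built TimeSeriesDataSet
    loss=QuantileLoss(quantiles=[0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98]),
)
```

The model then outputs one prediction per quantile at each forecast horizon, from which prediction intervals can be read off directly.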

Theoretical Basis

Quantile loss for a single quantile q:

$$\rho_q(y, \hat{y}) = \max\big(q\,(y - \hat{y}),\; (1 - q)\,(\hat{y} - y)\big)$$

Equivalently:

$$\rho_q(y, \hat{y}) = (y - \hat{y})\,\big(q - \mathbb{1}_{y < \hat{y}}\big)$$

Multi-quantile loss (summed over K quantiles):

$$\mathcal{L} = \frac{2}{K} \sum_{k=1}^{K} \rho_{q_k}\big(y, \hat{y}_{q_k}\big)$$

The factor of 2 normalizes the loss so that the q = 0.5 term equals the absolute error; with the median as the only quantile, the loss coincides with the MAE.

Default quantiles: [0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98] — providing a 96% prediction interval (0.02 to 0.98).
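The multi-quantile loss and its normalization can be verified with a short NumPy sketch (`multi_quantile_loss` is a hypothetical helper, not a library function):

```python
import numpy as np

QUANTILES = [0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98]

def multi_quantile_loss(y, y_hat, quantiles=QUANTILES):
    """Mean multi-quantile loss with the 2/K normalization.

    y:     (n,) observed values
    y_hat: (n, K) predicted quantiles, column k for quantiles[k]
    """
    y = np.asarray(y)[:, None]           # shape (n, 1)
    q = np.asarray(quantiles)[None, :]   # shape (1, K)
    diff = y - np.asarray(y_hat)         # shape (n, K)
    per_q = np.maximum(q * diff, (q - 1) * diff)  # pinball loss per quantile
    return (2.0 / len(quantiles)) * per_q.sum(axis=1).mean()

# With the median as the only quantile, the loss reduces to the MAE:
y = np.array([1.0, 3.0, 5.0])
median_pred = np.array([[2.0], [2.0], [2.0]])
print(multi_quantile_loss(y, median_pred, quantiles=[0.5]))  # ~1.667, the MAE
```

Here the absolute errors are 1, 1, and 3, so the MAE is 5/3, and the 2/K factor makes the single-quantile median loss match it exactly.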

Related Pages

Implemented By
