Implementation:Sktime Pytorch forecasting NormalDistributionLoss
| Knowledge Sources | |
|---|---|
| Domains | Time_Series, Loss_Functions, Probabilistic_Forecasting |
| Last Updated | 2026-02-08 07:00 GMT |
Overview
Concrete tool from the pytorch-forecasting library for computing the Normal-distribution negative log-likelihood loss used in probabilistic forecasting.
Description
The NormalDistributionLoss class extends DistributionLoss to model targets as Gaussian-distributed, using PyTorch's torch.distributions.Normal as the underlying distribution. The model predicts two parameters per time step: loc (the mean) and scale (the standard deviation, enforced positive via softplus). The rescale_parameters method applies softplus to the scale parameter and prepends the target normalization parameters (center and scale from the GroupNormalizer). The map_x_to_distribution method then constructs the full TransformedDistribution, applying the normalization parameters as an affine transformation along with any inverse transformations (log, logit, etc.) from the normalizer.
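The mechanics described above can be sketched with plain torch.distributions. This is an illustrative reconstruction, not the library's internals: the tensor shapes and variable names are assumptions.

```python
import torch
import torch.nn.functional as F
from torch import distributions

# Hypothetical raw network output of shape (batch, horizon, 2): loc and scale
raw = torch.randn(4, 6, 2)
# Hypothetical per-sample (center, scale) pairs, as a GroupNormalizer would supply
target_scale = torch.tensor([10.0, 2.0]).expand(4, 2)

loc = raw[..., 0]                # mean, unconstrained
scale = F.softplus(raw[..., 1])  # standard deviation, forced positive

# Base Normal in normalized space, then an affine rescaling to target space
base = distributions.Normal(loc, scale)
center = target_scale[..., 0].unsqueeze(-1)  # broadcast over the horizon
spread = target_scale[..., 1].unsqueeze(-1)
dist = distributions.TransformedDistribution(
    base, [distributions.AffineTransform(loc=center, scale=spread)]
)

# Negative log-likelihood per time step, the quantity the loss aggregates
y = 10.0 + 2.0 * torch.randn(4, 6)
nll = -dist.log_prob(y)  # shape (4, 6)
```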
Usage
This is the default loss for DeepAR models: it is set automatically when no loss is specified in DeepAR.from_dataset(). It requires a continuous target and a GroupNormalizer or TorchNormalizer (not NaNLabelEncoder) as the normalizer.
Code Reference
Source Location
- Repository: pytorch-forecasting
- File: pytorch_forecasting/metrics/distributions.py
- Lines: L18-58
Signature
class NormalDistributionLoss(DistributionLoss):
    """Normal distribution loss."""

    distribution_class = distributions.Normal
    distribution_arguments = ["loc", "scale"]

    def map_x_to_distribution(
        self, x: torch.Tensor
    ) -> distributions.TransformedDistribution:
        """Map network output to Normal distribution with affine rescaling."""

    def rescale_parameters(
        self,
        parameters: torch.Tensor,
        target_scale: torch.Tensor,
        encoder: BaseEstimator,
    ) -> torch.Tensor:
        """Apply softplus to scale and prepend target normalization params."""
Import
from pytorch_forecasting.metrics import NormalDistributionLoss
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| (constructor) | — | — | No required arguments; inherits from DistributionLoss |
| parameters | torch.Tensor | Yes (to rescale) | Raw model output of shape (batch, horizon, 2): loc and scale |
| target_scale | torch.Tensor | Yes (to rescale) | Normalization center and scale from GroupNormalizer |
| encoder | BaseEstimator | Yes (to rescale) | Target normalizer whose transformations are applied during rescaling |
Outputs
| Name | Type | Description |
|---|---|---|
| loss() | torch.Tensor | Negative log-likelihood per sample |
| map_x_to_distribution() | TransformedDistribution | Full Normal distribution with rescaling transforms |
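To illustrate the output contract, the TransformedDistribution returned by map_x_to_distribution supports both log_prob (which the loss negates) and sampling in the rescaled target space. The loc/scale values and horizon length below are made-up assumptions for illustration:

```python
import torch
from torch import distributions

# Hypothetical rescaled forecast distribution for a 6-step horizon
base = distributions.Normal(torch.zeros(6), torch.ones(6))
dist = distributions.TransformedDistribution(
    base, [distributions.AffineTransform(loc=100.0, scale=5.0)]
)

# Per-step negative log-likelihood, the quantity loss() aggregates
y = torch.full((6,), 100.0)
nll = -dist.log_prob(y)  # shape (6,)

# Monte Carlo samples land in the rescaled target space
samples = dist.sample((10_000,))  # shape (10000, 6)
```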
Usage Examples
DeepAR with Normal Loss
from pytorch_forecasting import DeepAR
from pytorch_forecasting.metrics import NormalDistributionLoss
model = DeepAR.from_dataset(
    training,
    loss=NormalDistributionLoss(),
    learning_rate=0.1,
    hidden_size=32,
)
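Because the predictive distribution is Gaussian, prediction intervals follow directly from its inverse CDF. A minimal sketch with made-up loc/scale values (not produced by the model above):

```python
import torch
from torch import distributions

# Hypothetical rescaled forecast: mean 10, standard deviation 2, 6-step horizon
dist = distributions.Normal(torch.full((6,), 10.0), torch.full((6,), 2.0))

# 80% central prediction interval from the Gaussian inverse CDF
lower = dist.icdf(torch.tensor(0.1))  # 10th percentile per step
upper = dist.icdf(torch.tensor(0.9))  # 90th percentile per step
```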
Related Pages
Implements Principle
Requires Environment
- Environment:Sktime_Pytorch_forecasting_Core_Python_Dependencies
- Environment:Sktime_Pytorch_forecasting_Cpflows_MQF2_Dependencies