
Implementation:Sktime Pytorch forecasting DeepAR From Dataset

From Leeroopedia


Knowledge Sources
Domains Time_Series, Deep_Learning, Probabilistic_Forecasting
Last Updated 2026-02-08 07:00 GMT

Overview

A concrete tool for instantiating a DeepAR probabilistic forecasting model from a TimeSeriesDataSet, both provided by the pytorch-forecasting library.

Description

The DeepAR.from_dataset factory method creates a DeepAR model instance configured from dataset metadata. It infers variable lists, embedding sizes, and target configuration from the training TimeSeriesDataSet. The method validates that targets are continuous (not categorical) and, when multiple targets are specified, configures multi-target support with MultiLoss. The default loss is NormalDistributionLoss, which models the target as a Gaussian distribution.

Usage

Call this method after constructing a training TimeSeriesDataSet configured for DeepAR (with add_target_scales=True and a GroupNormalizer). The returned model is then passed to a Trainer for training.

Code Reference

Source Location

  • Repository: pytorch-forecasting
  • File: pytorch_forecasting/models/deepar/_deepar.py
  • Lines: L209-254 (from_dataset), L47-159 (__init__)

Signature

class DeepAR(AutoRegressiveBaseModelWithCovariates):
    @classmethod
    def from_dataset(
        cls,
        dataset: TimeSeriesDataSet,
        allowed_encoder_known_variable_names: list[str] = None,
        **kwargs,
    ):
        """
        Create model from dataset.

        Args:
            dataset: timeseries dataset
            allowed_encoder_known_variable_names: List of known variables
                allowed in encoder, defaults to all
            **kwargs: additional arguments such as hyperparameters

        Returns:
            DeepAR network
        """

    def __init__(
        self,
        cell_type: str = "LSTM",
        hidden_size: int = 10,
        rnn_layers: int = 2,
        dropout: float = 0.1,
        n_validation_samples: int = None,
        n_plotting_samples: int = None,
        target: str | list[str] = None,
        target_lags: dict[str, list[int]] | None = None,
        loss: DistributionLoss = None,  # defaults to NormalDistributionLoss()
        logging_metrics: nn.ModuleList = None,
        **kwargs,
    ):

Import

from pytorch_forecasting import DeepAR

I/O Contract

Inputs

Name Type Required Description
dataset TimeSeriesDataSet Yes Training dataset to infer architecture from
learning_rate float No Learning rate (default: 0.1)
hidden_size int No RNN hidden size (default: 10)
rnn_layers int No Number of RNN layers (default: 2)
dropout float No Dropout in RNN layers (default: 0.1)
cell_type str No "LSTM" or "GRU" (default: "LSTM")
loss DistributionLoss No Probabilistic loss (default: NormalDistributionLoss())
target_lags dict No Dictionary of target lags for seasonality hints

Outputs

Name Type Description
return DeepAR Configured DeepAR model ready for training

Usage Examples

Standard DeepAR Instantiation

from pytorch_forecasting import DeepAR
from pytorch_forecasting.metrics import NormalDistributionLoss

model = DeepAR.from_dataset(
    training,
    learning_rate=0.1,
    hidden_size=32,
    rnn_layers=2,
    dropout=0.1,
    loss=NormalDistributionLoss(),
)

print(f"Number of parameters: {model.size() / 1e3:.1f}k")

DeepAR with GRU Cell

model = DeepAR.from_dataset(
    training,
    learning_rate=0.1,
    hidden_size=64,
    rnn_layers=3,
    cell_type="GRU",
    loss=NormalDistributionLoss(),
)

Related Pages

Implements Principle

Requires Environment

Uses Heuristic
