
Principle:Sktime Pytorch forecasting DeepAR Model Instantiation

From Leeroopedia


Knowledge Sources
Domains Time_Series, Deep_Learning, Probabilistic_Forecasting
Last Updated 2026-02-08 07:00 GMT

Overview

Technique for instantiating an autoregressive recurrent neural network model that produces probabilistic forecasts by learning the parameters of a probability distribution.

Description

DeepAR is an autoregressive RNN-based model designed for probabilistic time series forecasting. Unlike point forecast models, DeepAR learns the parameters of a probability distribution (e.g., Normal, NegativeBinomial) at each time step, enabling uncertainty quantification through sampling. The model uses an encoder-decoder architecture where the encoder processes historical context through recurrent layers, and the decoder generates future predictions autoregressively — each predicted sample is fed back as input for the next step. Model instantiation via from_dataset configures the architecture from dataset metadata and validates that targets are continuous (categorical targets are not supported).

Usage

Use DeepAR when probabilistic forecasts with calibrated uncertainty intervals are needed. It is particularly suited for: (1) datasets with many related time series that share statistical patterns, (2) scenarios requiring distributional output (e.g., inventory optimization where quantile estimates matter), and (3) univariate or low-covariate settings. Use MultivariateNormalDistributionLoss to convert DeepAR into a DeepVAR network for correlated multi-target forecasting.
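To make the inventory use case concrete, here is a minimal numpy sketch: given predictive-distribution parameters for one future step (the mean and standard deviation below are hypothetical numbers, not model output), Monte Carlo samples yield the quantile estimate a service-level decision needs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Suppose the decoder produced a Normal predictive distribution for
# next-step demand: mean 100 units, std 15 (hypothetical values).
samples = rng.normal(loc=100.0, scale=15.0, size=10_000)

# An inventory policy can stock to a high quantile of predicted demand
# rather than the point forecast, e.g. a 90% service level.
q90 = np.quantile(samples, 0.90)
```

A point-forecast model would only supply the mean here; the distributional output is what makes the quantile-based stocking decision possible.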

Theoretical Basis

DeepAR models the conditional distribution of future values autoregressively:

$$p(y_{t+1:t+H} \mid y_{1:t}) = \prod_{h=1}^{H} p_\theta(y_{t+h} \mid y_{1:t+h-1})$$

At each step, the RNN hidden state encodes history and projects to distribution parameters:

$$h_t = \mathrm{RNN}(h_{t-1}, [y_{t-1}, x_t]), \qquad (\mu_t, \sigma_t) = \mathrm{MLP}(h_t), \qquad y_t \sim \mathcal{N}(\mu_t, \sigma_t^2)$$
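These update equations can be sketched in a few lines of numpy. The random weights below are stand-ins for trained parameters, and a single tanh cell stands in for the LSTM/GRU; the point is the autoregressive structure, where each sampled value is fed back as the next input.

```python
import numpy as np

rng = np.random.default_rng(42)
H, hidden = 5, 8

# Random stand-ins for trained RNN and projection weights.
W_h = rng.normal(scale=0.1, size=(hidden, hidden))
w_y = rng.normal(scale=0.1, size=hidden)
w_mu = rng.normal(scale=0.1, size=hidden)
w_sigma = rng.normal(scale=0.1, size=hidden)

h = np.zeros(hidden)
y_prev = 1.0  # last observed value from the encoder context
trajectory = []
for _ in range(H):
    # h_t = RNN(h_{t-1}, y_{t-1}); here a single tanh cell
    h = np.tanh(W_h @ h + w_y * y_prev)
    # project the hidden state to distribution parameters
    mu = w_mu @ h
    sigma = np.log1p(np.exp(w_sigma @ h))  # softplus keeps sigma > 0
    # sample y_t ~ N(mu, sigma^2) and feed it back as the next input
    y_prev = rng.normal(mu, sigma)
    trajectory.append(y_prev)
```

Repeating this loop many times yields sample paths whose empirical quantiles form the forecast intervals.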

Key design choices:

  • cell_type — LSTM or GRU recurrent cell
  • hidden_size — RNN hidden dimension (main capacity control)
  • rnn_layers — depth of RNN stack
  • loss — DistributionLoss determining output distribution family

Related Pages

Implemented By
