Implementation: pyro.param (Pyro PPL)
| Knowledge Sources | |
|---|---|
| Domains | Probabilistic_Programming, Optimization |
| Last Updated | 2026-02-09 00:00 GMT |
Overview
Concrete tool for declaring and managing optimizable parameters in Pyro's global parameter store.
Description
pyro.param declares a named parameter that is stored in Pyro's global ParamStoreDict and optimized during SVI training. Parameters support PyTorch constraints (e.g., positivity) and are automatically transformed between constrained and unconstrained spaces. The parameter store provides persistence, serialization, and name-based lookup of all learnable parameters.
Usage
Use pyro.param to declare free parameters in custom variational guides. For most use cases, prefer AutoGuide classes (AutoNormal, AutoDelta) which manage parameters automatically. Direct param usage is needed for custom guide architectures or when fine-grained control over parameterization is required.
Code Reference
Source Location
- Repository: pyro
- File: pyro/primitives.py
- Lines: L57-91
Signature
def param(
    name: str,
    init_tensor: Union[torch.Tensor, Callable[[], torch.Tensor], None] = None,
    constraint: constraints.Constraint = constraints.real,
    event_dim: Optional[int] = None,
) -> torch.Tensor:
    """
    Saves the variable as a parameter in the param store.

    Args:
        name: name of parameter
        init_tensor: initial tensor or lazy callable
        constraint: torch constraint, defaults to constraints.real
        event_dim: optional number of rightmost dimensions unrelated to batching

    Returns:
        A constrained parameter tensor
    """
Import
import pyro
# Used as: pyro.param(name, init_tensor, ...)
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| name | str | Yes | Unique name for the parameter |
| init_tensor | Union[Tensor, Callable, None] | No | Initial value or lazy callable |
| constraint | constraints.Constraint | No | Constraint on parameter space (default: real) |
| event_dim | Optional[int] | No | Number of rightmost event dimensions |
Outputs
| Name | Type | Description |
|---|---|---|
| return | torch.Tensor | Constrained parameter tensor with gradient tracking |
Usage Examples
Custom Guide Parameters
import pyro
import pyro.distributions as dist
import torch
def guide(data):
    # Learnable location and scale
    mu_loc = pyro.param("mu_loc", torch.tensor(0.0))
    mu_scale = pyro.param("mu_scale", torch.tensor(1.0),
                          constraint=dist.constraints.positive)
    pyro.sample("mu", dist.Normal(mu_loc, mu_scale))