Implementation: LaurentMazare tch-rs Adam Build
| Knowledge Sources | |
|---|---|
| Domains | Deep_Learning, Optimization |
| Last Updated | 2026-02-08 14:00 GMT |
Overview
A concrete tool, provided by the tch `nn` module, for constructing the Adam optimizer used in gradient-based training.
Description
`nn::Adam::default().build(&vs, lr)` creates an `Optimizer` that wraps a C++ (libtorch) Adam optimizer, with all trainable variables from the given `VarStore` registered for optimization. The resulting optimizer provides `backward_step`, which combines `zero_grad`, `backward`, and `step` into a single call, and `set_lr` for learning-rate scheduling.
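A minimal sketch of learning-rate scheduling via `set_lr`, assuming an `opt` built as described above; the step-decay schedule and its constants are illustrative, not part of the tch API:

```rust
// Illustrative step-decay schedule: halve the learning rate every 30 epochs.
let base_lr = 1e-3;
for epoch in 0..100i32 {
    opt.set_lr(base_lr * 0.5f64.powi(epoch / 30));
    // ... run this epoch's training steps with opt.backward_step(&loss) ...
}
```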
Usage
Use after creating a `VarStore` and defining all model layers, so that every layer's parameters are already registered in the store. Pass a reference to the `VarStore` and a learning rate to bind the optimizer to all trainable parameters; see the Usage Examples section below for the full sequence.
Code Reference
Source Location
- Repository: tch-rs
- File: src/nn/optimizer.rs
- Lines: 73-76 (Adam::default), 23-34 (OptimizerConfig::build)
Signature
```rust
// Adam configuration
impl Default for Adam {
    fn default() -> Self {
        Adam { beta1: 0.9, beta2: 0.999, wd: 0., eps: 1e-8, amsgrad: false }
    }
}

// OptimizerConfig::build (shared by all optimizer configurations)
fn build(self, vs: &VarStore, lr: f64) -> Result<Optimizer, TchError>
```
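Because `Adam` is a plain configuration struct with public fields, non-default hyperparameters can be set with standard struct-update syntax before calling `build`. A sketch; the hyperparameter values are illustrative, not recommendations:

```rust
// Adam with weight decay and AMSGrad enabled (illustrative values).
let mut opt = nn::Adam { wd: 1e-4, amsgrad: true, ..Default::default() }.build(&vs, 1e-3)?;
```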
Import
```rust
use tch::nn::{self, OptimizerConfig};
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| vs | &VarStore | Yes | Variable store whose trainable parameters to optimize |
| lr | f64 | Yes | Learning rate |
Outputs
| Name | Type | Description |
|---|---|---|
| optimizer | Result<Optimizer, TchError> | On success, an Optimizer wrapping the C++ Adam optimizer with all trainable variables registered |
Usage Examples
```rust
use tch::nn::{self, Module, OptimizerConfig};
use tch::{Device, Kind, Tensor};

fn main() -> Result<(), tch::TchError> {
    let vs = nn::VarStore::new(Device::Cpu);
    // Illustrative stand-in model: layers built from vs.root() register their parameters in vs.
    let model = nn::linear(vs.root(), 784, 10, Default::default());
    let mut opt = nn::Adam::default().build(&vs, 1e-3)?;
    // One training step on illustrative random data (zero_grad + backward + step in one call).
    let input = Tensor::randn(&[64, 784], (Kind::Float, Device::Cpu));
    let labels = Tensor::zeros(&[64], (Kind::Int64, Device::Cpu));
    let loss = model.forward(&input).cross_entropy_for_logits(&labels);
    opt.backward_step(&loss);
    Ok(())
}
```
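When something must happen between the backward pass and the parameter update (for example gradient clipping), the combined call can be decomposed into its three parts. A sketch using `opt` and `loss` from the example above:

```rust
// Manual equivalent of opt.backward_step(&loss).
opt.zero_grad();
loss.backward();
opt.step();
```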
Related Pages
Implements Principle