
Implementation:LaurentMazare Tch rs Optimizer Backward Step

From Leeroopedia


Knowledge Sources
Domains Deep_Learning, Optimization
Last Updated 2026-02-08 14:00 GMT

Overview

A concrete tool, provided by tch's nn module, for performing the zero-grad, backward pass, and optimizer step of a training iteration in a single call.

Description

Optimizer::backward_step is a convenience method that performs, in sequence, zero_grad, loss.backward(), and the optimizer step. Before doing so it calls add_missing_variables, so trainable variables added to the VarStore after the optimizer was built are still updated. Variants include backward_step_clip (per-element gradient value clipping) and backward_step_clip_norm (global L2 norm gradient clipping).
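The two clipping variants differ in how they shrink gradients before the step. As a dependency-free illustration of the underlying math (plain Rust on a gradient vector, not the tch-rs API itself): value clipping clamps each component independently, while norm clipping rescales the whole gradient so its L2 norm is at most the threshold, preserving its direction.

```rust
/// Value clipping: clamp each component to [-c, c]
/// (the per-element rule applied by `backward_step_clip`).
fn clip_value(grad: &mut [f64], c: f64) {
    for g in grad.iter_mut() {
        *g = g.clamp(-c, c);
    }
}

/// Norm clipping: rescale so the global L2 norm is at most `c`
/// (the rule applied by `backward_step_clip_norm`).
fn clip_norm(grad: &mut [f64], c: f64) {
    let norm = grad.iter().map(|g| g * g).sum::<f64>().sqrt();
    if norm > c {
        let scale = c / norm;
        for g in grad.iter_mut() {
            *g *= scale;
        }
    }
}

fn main() {
    let mut g1 = vec![3.0, -4.0];
    clip_value(&mut g1, 1.0);
    println!("{:?}", g1); // each component clamped independently

    let mut g2 = vec![3.0, -4.0]; // L2 norm = 5
    clip_norm(&mut g2, 1.0);
    println!("{:?}", g2); // rescaled to norm 1, direction preserved
}
```

Note the difference in effect: value clipping distorts the gradient's direction when components differ in magnitude, whereas norm clipping only shortens it.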

Usage

Call this method once per training-loop iteration with the computed scalar loss tensor. It is the standard training step used throughout tch-rs training workflows.

Code Reference

Source Location

  • Repository: tch-rs
  • File: src/nn/optimizer.rs
  • Lines: 260-265

Signature

pub fn backward_step(&mut self, loss: &Tensor)

Import

use tch::nn::{self, OptimizerConfig};

I/O Contract

Inputs

Name | Type    | Required | Description
loss | &Tensor | Yes      | Scalar loss tensor to backpropagate

Outputs

Name | Type | Description
()   | unit | Parameters updated in-place (zero_grad -> backward -> step)
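The in-place update bundles three phases in a fixed order. A minimal dependency-free sketch of that zero -> backward -> step sequence, using a hand-computed gradient for f(w) = (w - 3)^2 (illustration only, not tch autograd): zeroing first matters because the gradient buffer accumulates across backward passes.

```rust
// Illustration of the zero_grad -> backward -> step sequence that
// backward_step bundles, on f(w) = (w - 3)^2 with a hand-coded gradient.
fn main() {
    let mut w = 0.0_f64;    // parameter being optimized
    let mut grad = 0.0_f64; // persistent, accumulating gradient buffer
    let lr = 0.1;

    for _ in 0..100 {
        grad = 0.0;              // zero_grad: clear the stale gradient
        grad += 2.0 * (w - 3.0); // backward: accumulate df/dw into the buffer
        w -= lr * grad;          // step: in-place parameter update
    }
    println!("{w:.4}"); // prints 3.0000 (converged to the minimizer w = 3)
}
```

Skipping the zeroing phase would add each iteration's gradient on top of the previous ones, which is exactly the bug backward_step prevents by always clearing first.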

Usage Examples

use tch::{nn, nn::ModuleT, nn::OptimizerConfig, Device};

fn run() -> Result<(), tch::TchError> {
    let vs = nn::VarStore::new(Device::Cpu);
    // ... define model ... (any type implementing nn::ModuleT)
    let mut opt = nn::Adam::default().build(&vs, 1e-3)?;

    for _epoch in 1..=10 {
        // `dataset` is assumed to expose tch's Dataset iteration API
        for (xs, ys) in dataset.train_iter(64).shuffle().to_device(vs.device()) {
            let logits = model.forward_t(&xs, true);
            let loss = logits.cross_entropy_for_logits(&ys);
            opt.backward_step(&loss); // zero_grad + backward + step
        }
    }
    Ok(())
}

Related Pages

Implements Principle
