Principle:LaurentMazare Tch rs TorchScript Loading

From Leeroopedia


Knowledge Sources
Domains Model_Deployment, Interoperability
Last Updated 2026-02-08 14:00 GMT

Overview

Mechanism for loading a serialized TorchScript model from disk into a Rust-native module for inference.

Description

TorchScript loading deserializes a .pt file (exported from Python via torch.jit.script or torch.jit.trace) into a CModule that wraps the libtorch C++ torch::jit::Module. The loaded module contains both the model architecture and the weights, so it is ready for immediate inference. Loading supports device specification, so a model exported on GPU can be loaded onto the CPU (or vice versa). CModule implements the Module trait, integrating with the rest of the tch-rs tensor and module ecosystem.
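A minimal sketch of the load-and-infer flow described above, assuming a hypothetical "model.pt" exported from Python and a model that takes a single image-shaped tensor (the path and input shape are illustrative, not from the source):

```rust
use tch::{CModule, Device, Kind, Tensor};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Deserialize the TorchScript archive (architecture + weights)
    // into a CModule wrapping torch::jit::Module.
    let model = CModule::load("model.pt")?;

    // Build a dummy input; shape is an assumption for illustration.
    let input = Tensor::randn(&[1, 3, 224, 224], (Kind::Float, Device::Cpu));

    // forward_ts takes a slice of tensors and runs the saved graph.
    let output = model.forward_ts(&[input])?;
    println!("output shape: {:?}", output.size());
    Ok(())
}
```

No Python interpreter is involved at runtime; the saved graph executes directly through libtorch.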

Usage

Use to load any TorchScript model for Rust-based inference. For inference only, use CModule::load. For training in Rust, use TrainableCModule::load, which registers the model's parameters in a VarStore so an optimizer can update them.

Theoretical Basis

Loading Variants:
  CModule::load(path)                    → Inference only
  CModule::load_on_device(path, device)  → Load with device override
  TrainableCModule::load(path, vs_path)  → Training (registers params in VarStore)
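The three variants above can be sketched side by side; "model.pt" is a hypothetical path, and the device/optimizer choices are illustrative assumptions:

```rust
use tch::{nn, CModule, Device, TrainableCModule};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Inference only: load onto the device recorded in the archive.
    let _infer = CModule::load("model.pt")?;

    // Device override: e.g. run a GPU-exported model on CPU.
    let _on_cpu = CModule::load_on_device("model.pt", Device::Cpu)?;

    // Training: parameters get registered in the VarStore, so an
    // optimizer built from the same store can update them.
    let vs = nn::VarStore::new(Device::cuda_if_available());
    let trainable = TrainableCModule::load("model.pt", vs.root())?;
    trainable.set_train(); // enable dropout/batch-norm training behavior
    Ok(())
}
```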

CModule wraps C++ torch::jit::Module:
  - Preserves original computation graph
  - Supports forward_ts (tensor inputs) and forward_is (IValue inputs)
  - Implements Module trait for Tensor::apply compatibility
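A sketch of the three call styles listed above, again assuming a hypothetical "model.pt" whose forward takes one tensor (input shape is illustrative):

```rust
use tch::nn::Module;
use tch::{CModule, Device, IValue, Kind, Tensor};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let model = CModule::load("model.pt")?;
    let x = Tensor::randn(&[1, 10], (Kind::Float, Device::Cpu));

    // forward_ts: plain tensor inputs, tensor output.
    let _y = model.forward_ts(&[x.shallow_clone()])?;

    // forward_is: IValue inputs, for models whose signatures involve
    // tuples, lists, strings, or other non-tensor types.
    let _iv = model.forward_is(&[IValue::Tensor(x.shallow_clone())])?;

    // Module trait: chain the loaded model like any built-in layer.
    let _z = x.apply(&model);
    Ok(())
}
```

The Module-trait form panics on error rather than returning a Result, so the explicit forward_ts/forward_is calls are preferable when failure handling matters.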

Related Pages

Implemented By
