Implementation: LaurentMazare tch-rs CModule Load
| Knowledge Sources | |
|---|---|
| Domains | Model_Deployment, Interoperability |
| Last Updated | 2026-02-08 14:00 GMT |
Overview
A concrete tool for loading TorchScript `.pt` models for inference in Rust, provided by the `wrappers::jit` module of the tch crate.
Description
CModule::load deserializes a TorchScript `.pt` file into a CModule, a wrapper around the C++ `torch::jit::Module`. The CModule::load_on_device variant additionally specifies the target device, so a model exported on a GPU can be loaded on the CPU (or the reverse). CModule implements the Module trait and provides the `forward_ts` (tensor-slice input) and `forward_is` (IValue input) methods.
Usage
Use for inference-only scenarios where you do not need to train the model in Rust. For training, use TrainableCModule instead.
Code Reference
Source Location
- Repository: tch-rs
- File: src/wrappers/jit.rs
- Lines: 435-439 (load), 445-452 (load_on_device)
Signature
```rust
impl CModule {
    pub fn load<T: AsRef<std::path::Path>>(path: T) -> Result<CModule, TchError>

    pub fn load_on_device<T: AsRef<std::path::Path>>(
        path: T,
        device: Device,
    ) -> Result<CModule, TchError>
}
```
Import
```rust
use tch::CModule;
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| path | T: AsRef<Path> | Yes | Path to TorchScript .pt file |
| device | Device | No | Target device (only for load_on_device) |
Outputs
| Name | Type | Description |
|---|---|---|
| return value | Result<CModule, TchError> | Loaded JIT module implementing the Module trait |
Usage Examples
```rust
use tch::{CModule, Device, nn::Module, vision::imagenet};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load on the default device
    let model = CModule::load("resnet18.pt")?;
    // Load a GPU-exported model on the CPU (shadows the binding above)
    let model = CModule::load_on_device("resnet18.pt", Device::Cpu)?;
    // Inference: apply the module to a batched image tensor
    let image = imagenet::load_image_and_resize224("photo.jpg")?;
    let output = image.unsqueeze(0).apply(&model);
    println!("{:?}", output.size());
    Ok(())
}
```