Implementation: LaurentMazare/tch-rs `nn::seq`
| Knowledge Sources | |
|---|---|
| Domains | Deep_Learning, Model_Architecture |
| Last Updated | 2026-02-08 14:00 GMT |
Overview
A concrete tool from the tch `nn` module for building sequential chains of layers.
Description
nn::seq creates an empty Sequential container. Layers are added via the builder pattern using .add() for Module layers and .add_fn() for closure-based transformations. The Sequential container implements the Module trait, enabling it to be used as a single layer in larger architectures. A SequentialT variant is available for layers requiring the train flag (e.g., dropout, batch norm).
Usage
Use this when constructing feedforward architectures. Call `nn::seq()`, then chain `.add()` and `.add_fn()` calls to compose your network.
Code Reference
Source Location
- Repository: tch-rs
- File: src/nn/sequential.rs
- Lines: 12-14
Signature
```rust
pub fn seq() -> Sequential
```
Import
```rust
use tch::nn;
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| (none) | — | — | No arguments; layers added via builder pattern |
Outputs
| Name | Type | Description |
|---|---|---|
| Sequential | nn::Sequential | Empty layer container implementing Module trait |
Usage Examples
Two-Layer Feedforward Network
```rust
use tch::{nn, nn::Module};

let vs = nn::VarStore::new(tch::Device::Cpu);
let net = nn::seq()
    .add(nn::linear(vs.root() / "layer1", 784, 128, Default::default()))
    .add_fn(|xs| xs.relu())
    .add(nn::linear(vs.root() / "layer2", 128, 10, Default::default()));

// Forward pass
let input = tch::Tensor::randn([1, 784], tch::kind::FLOAT_CPU);
let output = net.forward(&input); // shape: [1, 10]
```