Principle: LaurentMazare tch-rs Forward Pass Inference
| Knowledge Sources | |
|---|---|
| Domains | Deep_Learning, Model_Inference |
| Last Updated | 2026-02-08 14:00 GMT |
Overview
The forward pass computes the model output from input data by propagating it through all layers; a train flag controls the behavior of layers such as dropout and batch normalization.
Description
The forward pass takes an input tensor and propagates it through every layer of a neural network to produce an output (typically logits or probabilities). The ModuleT trait provides forward_t, which accepts a boolean train flag. During inference (train=false), dropout is disabled and batch normalization uses its running statistics rather than per-batch statistics. This distinction is critical for correct, reproducible inference results.
Usage
Use after loading a model with pretrained weights. Always set train=false for inference to ensure correct behavior of normalization and regularization layers.
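The inference pattern can be sketched with self-contained toy types (ToyModel and Lcg are illustrative stand-ins, not tch-rs API; a real program would build the network with tch's nn module and load checkpoint weights). With train=false the stochastic dropout path is skipped, so repeated calls on the same input return identical results:

```rust
// Tiny linear congruential generator standing in for the layer's RNG
// (assumption for illustration; tch uses libtorch's RNG internally).
struct Lcg(u64);

impl Lcg {
    fn next_f64(&mut self) -> f64 {
        self.0 = self.0.wrapping_mul(6364136223846793005).wrapping_add(1442695040888963407);
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }
}

// Toy model: one scalar weight (our "pretrained" parameter) plus dropout.
struct ToyModel {
    weight: f64,
    dropout_p: f64,
    rng: Lcg,
}

impl ToyModel {
    fn forward_t(&mut self, xs: &[f64], train: bool) -> Vec<f64> {
        xs.iter()
            .map(|&x| {
                let y = x * self.weight;
                // Dropout fires only in training mode; inference is deterministic.
                if train && self.rng.next_f64() < self.dropout_p { 0.0 } else { y }
            })
            .collect()
    }
}

fn main() {
    let mut model = ToyModel { weight: 2.0, dropout_p: 0.5, rng: Lcg(42) };
    let input = [1.0, 2.0, 3.0];
    // train=false: two calls on the same input must agree.
    let a = model.forward_t(&input, false);
    let b = model.forward_t(&input, false);
    assert_eq!(a, b);
    assert_eq!(a, vec![2.0, 4.0, 6.0]);
}
```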
Theoretical Basis
Module trait: fn forward(&self, xs: &Tensor) -> Tensor
ModuleT trait: fn forward_t(&self, xs: &Tensor, train: bool) -> Tensor
Blanket impl: every Module automatically satisfies ModuleT (ignoring the train flag)
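The trait relationship can be sketched with toy types (Tensor here is a plain Vec wrapper, not tch's Tensor): the blanket impl lets any train-agnostic Module be used wherever a ModuleT is expected, so both kinds of layers compose in one network.

```rust
// Toy stand-in for tch's Tensor (assumption for illustration).
#[derive(Clone, Debug, PartialEq)]
struct Tensor(Vec<f64>);

// Mirrors the shape of tch's Module trait: train-agnostic layers.
trait Module {
    fn forward(&self, xs: &Tensor) -> Tensor;
}

// Mirrors ModuleT: layers whose behavior depends on the train flag.
trait ModuleT {
    fn forward_t(&self, xs: &Tensor, train: bool) -> Tensor;
}

// Blanket impl: every Module is a ModuleT that ignores `train`.
impl<M: Module> ModuleT for M {
    fn forward_t(&self, xs: &Tensor, _train: bool) -> Tensor {
        self.forward(xs)
    }
}

// A train-agnostic layer: elementwise scaling.
struct Scale(f64);

impl Module for Scale {
    fn forward(&self, xs: &Tensor) -> Tensor {
        Tensor(xs.0.iter().map(|x| x * self.0).collect())
    }
}

fn main() {
    let layer = Scale(2.0);
    let xs = Tensor(vec![1.0, 2.0]);
    // forward_t is available via the blanket impl; the flag has no effect.
    assert_eq!(layer.forward_t(&xs, false), Tensor(vec![2.0, 4.0]));
    assert_eq!(layer.forward_t(&xs, true), layer.forward(&xs));
}
```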
Train flag effects:
- BatchNorm: train=true uses batch stats; train=false uses running stats
- Dropout: train=true drops randomly; train=false passes through
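Both effects can be illustrated with simplified stand-ins (these are not tch's implementations; the explicit keep mask replaces dropout's internal random sampling, and batch norm is reduced to a single feature without learnable affine parameters):

```rust
// Inverted dropout: scale survivors by 1/(1-p) in training, identity in eval.
struct Dropout { p: f64 }

impl Dropout {
    // `keep` stands in for per-element random draws; a real layer samples them.
    fn forward_t(&self, xs: &[f64], keep: &[bool], train: bool) -> Vec<f64> {
        if !train {
            return xs.to_vec(); // inference: pass through unchanged
        }
        xs.iter()
            .zip(keep)
            .map(|(&x, &k)| if k { x / (1.0 - self.p) } else { 0.0 })
            .collect()
    }
}

// Batch norm for one feature, without the learnable scale/shift.
struct BatchNorm { running_mean: f64, running_var: f64, eps: f64 }

impl BatchNorm {
    fn forward_t(&self, xs: &[f64], train: bool) -> Vec<f64> {
        let (mean, var) = if train {
            // training: statistics of the current batch
            // (real layers also update the running stats with momentum here)
            let m = xs.iter().sum::<f64>() / xs.len() as f64;
            let v = xs.iter().map(|x| (x - m).powi(2)).sum::<f64>() / xs.len() as f64;
            (m, v)
        } else {
            // inference: fixed running statistics accumulated during training
            (self.running_mean, self.running_var)
        };
        xs.iter().map(|x| (x - mean) / (var + self.eps).sqrt()).collect()
    }
}

fn main() {
    let xs = [1.0, 3.0];

    let bn = BatchNorm { running_mean: 0.0, running_var: 1.0, eps: 0.0 };
    // Eval: running stats (mean 0, var 1) leave the inputs unchanged.
    assert_eq!(bn.forward_t(&xs, false), vec![1.0, 3.0]);
    // Train: batch stats (mean 2, var 1) normalize to [-1, 1].
    assert_eq!(bn.forward_t(&xs, true), vec![-1.0, 1.0]);

    let drop = Dropout { p: 0.5 };
    // Eval: dropout is a no-op regardless of the mask.
    assert_eq!(drop.forward_t(&xs, &[false, true], false), vec![1.0, 3.0]);
    // Train: dropped elements become 0, survivors are rescaled by 1/(1-p).
    assert_eq!(drop.forward_t(&xs, &[false, true], true), vec![0.0, 6.0]);
}
```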