Implementation: Alibaba ROLL DistillPipeline.val
| Field | Value |
|---|---|
| Domains | Knowledge_Distillation, Evaluation |
| Last Updated | 2026-02-07 20:00 GMT |
Overview
A concrete validation method for distillation training, provided by the Alibaba ROLL library.
Description
The `DistillPipeline.val` method evaluates the student model's supervised fine-tuning (SFT) loss on validation data. The teacher model is not queried during validation, so no teacher logits are required.
Usage
Called at regular evaluation intervals during distillation training to monitor the student's quality on held-out data.
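The core computation can be sketched as follows. This is a hypothetical illustration of an SFT validation pass, not ROLL's actual implementation: the function name `sft_val_loss`, the batch keys `input_ids`/`labels`, and the single-process loop are assumptions; ROLL's real `val` aggregates losses through its distributed training strategy.

```python
import torch


@torch.no_grad()
def sft_val_loss(model, val_dataloader, device="cpu"):
    """Mean next-token cross-entropy over a validation set (sketch).

    Hypothetical stand-in for DistillPipeline.val: it shows the core
    idea (student-only SFT loss, no teacher logits), not ROLL's API.
    """
    model.eval()
    total_loss, n_batches = 0.0, 0
    for batch in val_dataloader:
        input_ids = batch["input_ids"].to(device)   # [B, T]
        labels = batch["labels"].to(device)         # [B, T]
        logits = model(input_ids)                   # [B, T, V]
        # Shift so position t predicts token t+1; -100 marks ignored labels.
        loss = torch.nn.functional.cross_entropy(
            logits[:, :-1].reshape(-1, logits.size(-1)),
            labels[:, 1:].reshape(-1),
            ignore_index=-100,
        )
        total_loss += loss.item()
        n_batches += 1
    return {"student/val_loss": total_loss / n_batches}
```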
Code Reference
Source Location
- Repository: Alibaba ROLL
- File: roll/pipeline/distill/distill_pipeline.py
- Lines: L336-346
Signature
```python
class DistillPipeline(BasePipeline):
    @torch.no_grad()
    def val(self) -> Dict:
        """
        Validation for distillation pipeline (SFT loss only).

        Returns:
            Dict with student/val_loss metric
        """
```
Import
```python
from roll.pipeline.distill.distill_pipeline import DistillPipeline
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| val_dataloader | DataLoader | Yes | Validation data (from pipeline state) |
Outputs
| Name | Type | Description |
|---|---|---|
| metrics | Dict | Contains the `student/val_loss` metric |
Usage Examples
```python
if step % eval_steps == 0:
    val_metrics = pipeline.val()
    print(val_metrics)  # {"student/val_loss": 1.85}
```
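The check above is typically embedded in a training driver loop. The sketch below is a hypothetical pattern, not part of ROLL's API: `train_with_periodic_val`, `train_step`, and `val_fn` are assumed names, with `val_fn` standing in for `pipeline.val`.

```python
def train_with_periodic_val(num_steps, eval_steps, train_step, val_fn):
    """Run `train_step` every step and `val_fn` every `eval_steps` steps.

    Hypothetical driver loop: `val_fn` stands in for DistillPipeline.val
    and must return a dict containing "student/val_loss".
    """
    val_log = {}
    for step in range(1, num_steps + 1):
        train_step(step)
        if step % eval_steps == 0:
            # Record the validation loss at this step.
            val_log[step] = val_fn()["student/val_loss"]
    return val_log
```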
Heuristics Applied
No specific heuristics apply to this implementation.