Principle: hpcaitech ColossalAI Ray Weight Synchronization
| Knowledge Sources | |
|---|---|
| Domains | Distributed_Computing, Infrastructure |
| Last Updated | 2026-02-09 00:00 GMT |
Overview
A distributed weight synchronization pattern using Ray collective operations to broadcast updated model parameters from training consumers to inference producers.
Description
After each policy update, the consumer must distribute the new model weights to all producers so that they generate experiences with the latest policy. The pattern uses Ray's collective communication library (built on NCCL/Gloo) to broadcast tensor dictionaries efficiently across actors. Special handling is required for bfloat16 tensors on the Gloo backend, which does not support that dtype natively.
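Since Gloo cannot transfer bfloat16 tensors directly, a common workaround is to cast them to float32 before the broadcast and restore the original dtype on the receiving side. The sketch below illustrates that idea only; `prepare_for_gloo` and `restore_dtypes` are hypothetical helper names, not ColossalAI's actual API:

```python
import torch

def prepare_for_gloo(state_dict):
    """Cast bfloat16 tensors to float32 so Gloo can transfer them.

    Returns the cast dict plus the set of keys that were bfloat16,
    so the receiver can restore the original dtypes afterwards.
    """
    bf16_keys = {k for k, t in state_dict.items() if t.dtype == torch.bfloat16}
    cast = {k: (t.float() if k in bf16_keys else t) for k, t in state_dict.items()}
    return cast, bf16_keys

def restore_dtypes(state_dict, bf16_keys):
    """Cast the marked tensors back to bfloat16 after the broadcast."""
    return {k: (t.bfloat16() if k in bf16_keys else t) for k, t in state_dict.items()}
```

NCCL handles bfloat16 natively on recent versions, so this round-trip is only needed on the Gloo (CPU) path.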
Usage
Called automatically after each consumer training step; no manual invocation is needed.
Theoretical Basis
The synchronization follows a broadcast pattern:
- The consumer gathers the updated state_dict
- The state_dict is broadcast to all producers via a Ray collective broadcast (NCCL, or Gloo with bfloat16 tensors cast to a supported dtype)
- The producers load the new weights into their inference models
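The steps above can be sketched with Ray's `ray.util.collective` API. This is a hedged sketch, not ColossalAI's actual code: the group name, helper names, and call sites are assumptions. One detail the sketch makes explicit is that every rank must issue broadcasts in the same order, since collective operations are matched by call order rather than by parameter name:

```python
def broadcast_order(state_dict):
    # All ranks must broadcast tensors in the same sequence; sorting the
    # parameter names gives a deterministic, rank-independent order.
    return sorted(state_dict.keys())

def sync_weights(state_dict, src_rank=0, group_name="weight_sync"):
    """Broadcast every tensor in state_dict from src_rank to all ranks.

    Assumes ray.util.collective.init_collective_group() was already
    called on every participating actor with this group_name.
    """
    from ray.util import collective  # lazy import; requires a Ray installation
    for name in broadcast_order(state_dict):
        # broadcast() operates in place: src_rank sends, every other rank
        # receives into its existing tensor of the same shape and dtype.
        collective.broadcast(state_dict[name], src_rank=src_rank, group_name=group_name)
```

On the consumer this sends; on each producer the same loop overwrites the inference model's tensors, after which the producer can continue generating with the new policy.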