# Environment: Allenai Open Instruct Python 3.12 Runtime
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure |
| Last Updated | 2026-02-07 00:00 GMT |
## Overview

Open Instruct strictly requires a Python 3.12 runtime, managed via the uv package manager.
## Description
The project requires exactly Python 3.12 (not 3.11 or 3.13). This is enforced in `pyproject.toml` with `requires-python = "==3.12.*"`. The uv package manager handles dependency resolution with platform-specific indices for PyTorch (CUDA 12.9 for Linux x86_64, CUDA 13.0 for aarch64, CPU for macOS).
## Usage
This environment is a prerequisite for all operations: training, evaluation, data processing, and testing. The Python 3.12 constraint ensures compatibility across all dependencies including torch, vLLM, DeepSpeed, and transformers.
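A start-up guard for the `==3.12.*` constraint can be sketched as follows. The `is_python_312` helper is illustrative, not part of Open Instruct; normally uv enforces the pin at resolution time.

```python
import sys


def is_python_312(version_info=None):
    """True iff the interpreter satisfies the ==3.12.* pin from pyproject.toml.

    Accepts an explicit (major, minor, ...) tuple for testing; defaults to the
    running interpreter's sys.version_info.
    """
    vi = sys.version_info if version_info is None else version_info
    return (vi[0], vi[1]) == (3, 12)
```

Calling `is_python_312()` early in an entry point makes the failure mode explicit (a clear error) instead of an import-time crash deep in a dependency.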
## System Requirements
| Category | Requirement | Notes |
|---|---|---|
| Python | 3.12.x exactly | Enforced via pyproject.toml |
| Package Manager | uv (recommended) | Handles platform-specific dependency resolution |
| OS | Linux (x86_64, aarch64) or macOS | Windows not supported |
## Dependencies

### Core Python Packages
- `accelerate` >= 1.10.1
- `datasets` >= 4.0.0
- `numpy` >= 2
- `torch` >= 2.9.0, < 2.10
- `transformers` >= 4.57.0
- `wandb` == 0.23.1
- `ray[default]` >= 2.49.2
- `deepspeed` >= 0.18.3
- `peft` >= 0.13.2
- `ai2-olmo-core` == 2.3.0
- `vllm` == 0.14.1 (Linux only)
### Development Dependencies
- `beaker-py` >= 2.5.3
- `pytest` >= 8.3.4
- `ruff` >= 0.11.13
- `mkdocs-material` >= 9.6.8
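As a rough illustration of how a minimum-version pin like `accelerate >= 1.10.1` is evaluated: a real resolver such as uv implements full PEP 440 semantics (pre-releases, epochs, local versions), which this simplified sketch ignores.

```python
def parse_version(v):
    """Naive dotted-integer parse: '1.10.1' -> (1, 10, 1).

    PEP 440 suffixes (rc1, .dev0, ...) are simply dropped here; real
    resolvers handle them properly.
    """
    return tuple(int(part) for part in v.split(".") if part.isdigit())


def satisfies_min(installed, minimum):
    """Check a '>=' style pin via lexicographic tuple comparison."""
    return parse_version(installed) >= parse_version(minimum)
```

Tuple comparison is why `2.9.0` satisfies a `>= 2.9.0, < 2.10` range: `(2, 9, 0)` sorts below `(2, 10)` even though "9" > "1" as strings.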
## Credentials
No credentials required for the Python runtime itself.
## Quick Install

```bash
# Install uv if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install Python 3.12 and all dependencies
uv sync
```
## Code Evidence

Python version constraint from `pyproject.toml:6`:

```toml
requires-python = "==3.12.*"
```
Platform-specific PyTorch source indices from `pyproject.toml:56-59`:

```toml
[tool.uv.sources]
torch = [
    { index = "pytorch-cu129", marker = "platform_system == 'Linux' and platform_machine != 'aarch64'" },
    { index = "pytorch-cu130", marker = "platform_system == 'Linux' and platform_machine == 'aarch64'" },
]
```
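The marker logic above can be mirrored in plain Python for a quick local check. `select_torch_index` is a hypothetical helper, not an Open Instruct API; the marker fields `platform_system` and `platform_machine` correspond to `platform.system()` and `platform.machine()`.

```python
import platform


def select_torch_index(system=None, machine=None):
    """Return the uv index name the tool.uv.sources markers select for a host."""
    system = system if system is not None else platform.system()
    machine = machine if machine is not None else platform.machine()
    if system == "Linux" and machine != "aarch64":
        return "pytorch-cu129"  # CUDA 12.9 wheels (x86_64 Linux)
    if system == "Linux" and machine == "aarch64":
        return "pytorch-cu130"  # CUDA 13.0 wheels (aarch64 Linux)
    return None  # no marker matches on macOS: default index (CPU wheels)
```

Note the asymmetry: the first marker uses `!=` rather than `== 'x86_64'`, so any non-aarch64 Linux machine falls into the CUDA 12.9 bucket.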
Supported platforms from `pyproject.toml:89-93`:

```toml
environments = [
    "sys_platform == 'linux' and platform_machine == 'x86_64'",
    "sys_platform == 'linux' and platform_machine == 'aarch64'",
    "sys_platform == 'darwin'",
]
```
## Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `requires-python not satisfied` | Using Python 3.11 or 3.13 | Install Python 3.12 via uv or pyenv |
| `ModuleNotFoundError: No module named 'vllm'` | Running on macOS where vLLM is excluded | Use Linux for vLLM-dependent features (GRPO training) |
| `flash-attn build fails` | Missing CUDA toolkit or on unsupported platform | Ensure CUDA toolkit installed; on macOS/ARM use `attn_implementation="sdpa"` instead |
## Compatibility Notes
- Linux x86_64: Full support with CUDA 12.9 PyTorch index.
- Linux aarch64: Full support with CUDA 13.0 PyTorch index. flash-attn not available.
- macOS (Darwin): Limited support; vLLM, bitsandbytes, flash-attn, and liger-kernel excluded.
- Windows: Not supported; it is absent from the `environments` list in `pyproject.toml`.
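The flash-attn caveats above suggest gating the attention backend on platform. A hedged sketch follows: the `pick_attn_implementation` helper and its fallback order are assumptions, not project code, although `attn_implementation` itself is the standard `transformers` `from_pretrained` argument.

```python
import platform


def pick_attn_implementation(system=None, machine=None, flash_attn_installed=False):
    """Choose an attn_implementation value consistent with the notes above."""
    system = system if system is not None else platform.system()
    machine = machine if machine is not None else platform.machine()
    # flash-attn is only viable on Linux x86_64 with a CUDA toolchain present
    if flash_attn_installed and system == "Linux" and machine == "x86_64":
        return "flash_attention_2"
    # macOS and aarch64 Linux fall back to PyTorch's built-in SDPA kernels
    return "sdpa"
```

The selected string would then be passed through, e.g. `AutoModelForCausalLM.from_pretrained(name, attn_implementation=impl)`.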