Environment:Onnx Onnx Python Runtime Environment
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, ML_Framework |
| Last Updated | 2026-02-10 02:00 GMT |
Overview
Python 3.10+ environment with NumPy, Protobuf, typing_extensions, and ml_dtypes for running the ONNX Python API.
Description
This environment provides the standard Python runtime context for using the ONNX library. It includes the core Python package dependencies required to create, validate, inspect, convert, and evaluate ONNX models via the Python API. The environment supports all major platforms (Linux, macOS, Windows) and architectures, with a platform-specific variant for s390x (IBM Z) requiring a higher ml_dtypes version. Python 3.12+ enables stable ABI (Limited API) wheels, which work across Python releases without rebuilds. The C++ extension module (onnx_cpp2py_export) ships pre-built inside the binary wheel and backs the checker, shape inference, version converter, and printer functionality.
Usage
Use this environment for any workflow that uses the ONNX Python API: model creation via onnx.helper, model validation via onnx.checker, shape inference via onnx.shape_inference, model composition via onnx.compose, version conversion via onnx.version_converter, reference evaluation via onnx.reference, and external data handling via onnx.external_data_helper. This is the mandatory prerequisite for all Python-based ONNX workflows.
System Requirements
| Category | Requirement | Notes |
|---|---|---|
| OS | Linux, macOS, or Windows | All major platforms supported; FreeBSD lacks stable ABI wheels |
| Python | >= 3.10 | Python 3.12+ enables stable ABI (Limited API) wheel support |
| Architecture | x86_64, ARM64, s390x | s390x requires ml_dtypes >= 0.5.4 |
| Disk | ~50MB | Package size including C++ extensions |
Dependencies
Python Packages
- `numpy` >= 1.23.2
- Python 3.10/3.11: minimum 1.23.2
- Python 3.12: minimum 1.26.0
- Python 3.13: minimum 2.1.0
- Python 3.14+: minimum 2.3.2
- `protobuf` >= 4.25.1
- Python 3.14+ on macOS: requires protobuf >= 6.31.0
- `typing_extensions` >= 4.7.1
- `ml_dtypes` >= 0.5.0
- s390x architecture: requires ml_dtypes >= 0.5.4
Optional Python Packages
- `ml_dtypes` (for float8, float4, int4, int2 type support) - included by default in requirements
- `numpy` (for reference evaluator) - included by default
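The platform- and Python-version-dependent minimums above are expressed as environment markers in the requirements files; the sketch below mirrors that logic in plain Python purely as an illustration (pip's marker evaluation is the authoritative mechanism):

```python
import platform
import sys

def min_ml_dtypes() -> str:
    """Minimum ml_dtypes version for the current architecture."""
    return "0.5.4" if platform.machine() == "s390x" else "0.5.0"

def min_numpy() -> str:
    """Minimum numpy version for the current Python version."""
    table = {
        (3, 10): "1.23.2",
        (3, 11): "1.23.2",
        (3, 12): "1.26.0",
        (3, 13): "2.1.0",
    }
    return table.get(sys.version_info[:2], "2.3.2")  # 3.14+ falls through

print(min_numpy(), min_ml_dtypes())
```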
Credentials
No credentials are required for the core ONNX library.
The ONNX Model Hub (onnx.hub) uses the following optional environment variables:
- `ONNX_HOME`: Override the default model cache directory location.
- `XDG_CACHE_HOME`: Standard Unix cache directory; models are stored in `$XDG_CACHE_HOME/onnx/hub`.
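A sketch of how that resolution order could look in code; the exact subpath appended under `ONNX_HOME` is an assumption here, and `onnx.hub.get_dir()` should be treated as the authoritative answer at runtime:

```python
import os
from pathlib import Path

def hub_cache_dir(env: dict[str, str]) -> Path:
    # ONNX_HOME, when set, overrides the cache root; otherwise fall
    # back to $XDG_CACHE_HOME/onnx, then ~/.cache/onnx. The trailing
    # "hub" segment mirrors the documented $XDG_CACHE_HOME/onnx/hub
    # layout (assumed to apply under ONNX_HOME as well).
    if "ONNX_HOME" in env:
        root = Path(env["ONNX_HOME"])
    else:
        cache = env.get("XDG_CACHE_HOME", str(Path.home() / ".cache"))
        root = Path(cache) / "onnx"
    return root / "hub"

print(hub_cache_dir(dict(os.environ)))
```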
Quick Install
```shell
# Install ONNX with all core dependencies
pip install onnx

# Install with reference evaluator support (adds numpy);
# quote the extra so shells like zsh do not glob the brackets
pip install "onnx[reference]"

# Pin the minimum supported versions (for compatibility testing)
pip install numpy==1.23.2 protobuf==4.25.1 typing_extensions==4.7.1 ml_dtypes==0.5.0
```
Code Evidence
Python version requirement from `pyproject.toml:22`:
```toml
requires-python = ">=3.10"
```
Runtime dependencies from `requirements.txt:1-5`:
```
numpy>=1.23.2
protobuf>=4.25.1
typing_extensions>=4.7.1
ml_dtypes>=0.5.0; platform_machine != "s390x"
ml_dtypes>=0.5.4; platform_machine == "s390x"
```
Minimum version matrix from `requirements-min.txt:4-10`:
```
protobuf==4.25.1; python_version<"3.14" or sys_platform != "darwin"
protobuf==6.31.0; python_version>="3.14" and sys_platform == "darwin"
numpy==1.23.2; python_version=="3.10"
numpy==1.23.2; python_version=="3.11"
numpy==1.26.0; python_version=="3.12"
numpy==2.1.0; python_version=="3.13"
numpy==2.3.2; python_version>="3.14"
```
ml_dtypes usage from `onnx/_mapping.py:8`:
```python
import ml_dtypes
```
Limited API detection from `setup.py:337-339`:
```python
NO_GIL = hasattr(sys, "_is_gil_enabled") and not sys._is_gil_enabled()
PY_312_OR_NEWER = sys.version_info >= (3, 12)
USE_LIMITED_API = not NO_GIL and PY_312_OR_NEWER and platform.system() != "FreeBSD"
```
Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `ImportError: cannot import name 'onnx_cpp2py_export'` | C++ extension not built or not found | Reinstall onnx: `pip install --force-reinstall onnx` |
| `ValueError: The proto size is larger than the 2 GB limit` | Model exceeds protobuf 2GB limit | Use `onnx.save_model()` with `save_as_external_data=True` |
| `ModuleNotFoundError: No module named 'ml_dtypes'` | ml_dtypes package not installed | `pip install "ml_dtypes>=0.5.0"` (quoted so the shell does not treat `>=` as redirection) |
| `protobuf version mismatch` | Installed protobuf is older than 4.25.1 | `pip install "protobuf>=4.25.1"` |
Compatibility Notes
- s390x (IBM Z): Requires ml_dtypes >= 0.5.4 instead of the standard >= 0.5.0.
- Python 3.14+ on macOS: Requires protobuf >= 6.31.0 due to a platform-specific compatibility issue.
- FreeBSD: Stable ABI (Limited API) wheels are not supported; uses standard Python ABI.
- Free-threaded Python (no GIL): Limited API wheels are not generated when GIL is disabled.
- Big-endian systems: ONNX stores tensors in little-endian format internally; byte-swapping is performed automatically by numpy_helper.
Related Pages
- Implementation:Onnx_Onnx_Helper_Make_Tensor_Value_Info
- Implementation:Onnx_Onnx_Helper_Make_Node
- Implementation:Onnx_Onnx_Helper_Make_Graph
- Implementation:Onnx_Onnx_Helper_Make_Model
- Implementation:Onnx_Onnx_Checker_Check_Model
- Implementation:Onnx_Onnx_Save_Model
- Implementation:Onnx_Onnx_Load_Model
- Implementation:Onnx_Onnx_Shape_Inference_Infer_Shapes
- Implementation:Onnx_Onnx_Printer_To_Text
- Implementation:Onnx_Onnx_Helper_Version_Table
- Implementation:Onnx_Onnx_Version_Converter_Convert_Version
- Implementation:Onnx_Onnx_Compose_Add_Prefix
- Implementation:Onnx_Onnx_Compose_Expand_Out_Dim
- Implementation:Onnx_Onnx_Compose_IO_Map
- Implementation:Onnx_Onnx_Compose_Merge_Models
- Implementation:Onnx_Onnx_ReferenceEvaluator_Init
- Implementation:Onnx_Onnx_Numpy_Helper_Conversion
- Implementation:Onnx_Onnx_ReferenceEvaluator_Run
- Implementation:Onnx_Onnx_Numpy_Assert_Allclose
- Implementation:Onnx_Onnx_Load_External_Data_For_Model
- Implementation:Onnx_Onnx_Convert_Model_To_External_Data
- Implementation:Onnx_Onnx_Save_Model_External
- Implementation:Onnx_Onnx_Check_Model_Path