Implementation: onnx.load_model
| Knowledge Sources | Details |
|---|---|
| Domains | Deserialization, Model_Persistence |
| Last Updated | 2026-02-10 00:00 GMT |
Overview
A concrete tool for loading serialized ONNX models into memory, provided by the top-level onnx package.
Description
The load_model function reads an ONNX model file from disk and deserializes it into an in-memory ModelProto object. It uses a pluggable serialization registry (onnx.serialization) to detect the format from the file extension and dispatch to the appropriate deserializer. When load_external_data is True (the default), the function also loads tensor data stored in external files into the model's tensor raw_data fields.
Usage
Use this function as the first step in any workflow that operates on an existing ONNX model. The returned ModelProto can then be passed to checker.check_model, shape_inference.infer_shapes, version_converter.convert_version, compose.merge_models, or ReferenceEvaluator.
Code Reference
Source Location
- Repository: onnx
- File: onnx/__init__.py
- Lines: 206-235
Signature
```python
def load_model(
    f: IO[bytes] | str | os.PathLike,
    format: str | None = None,
    load_external_data: bool = True,
) -> ModelProto:
    """Loads a serialized ModelProto into memory.

    Args:
        f: File-like object or string/PathLike with a file name.
        format: Serialization format. Auto-detected from extension if None.
            Defaults to 'protobuf' if the format cannot be determined.
        load_external_data: Whether to load external tensor data.
            Set to False if the external data is in a different directory.

    Returns:
        Loaded in-memory ModelProto.
    """
```
Import
```python
import onnx
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| f | IO[bytes] or str or PathLike | Yes | File path or file-like object to read from |
| format | str or None | No | Serialization format (auto-detected from extension) |
| load_external_data | bool | No | Load external tensor data (default: True) |
Outputs
| Name | Type | Description |
|---|---|---|
| return | ModelProto | In-memory model with all tensor data loaded |
Usage Examples
Basic Loading
```python
import onnx

# Load from binary protobuf (the most common format)
model = onnx.load_model("model.onnx")

# Load from JSON format
model = onnx.load_model("model.json", format="json")
```
Deferred External Data Loading
```python
import onnx
from onnx.external_data_helper import load_external_data_for_model

# Load the model structure without its external tensor data
model = onnx.load_model("model.onnx", load_external_data=False)

# Then load the external data from a different directory
load_external_data_for_model(model, "/path/to/external/data/")
```