Implementation:Microsoft Onnxruntime InferenceSession Init
Metadata
| Field | Value |
|---|---|
| Implementation Name | InferenceSession_Init |
| Repository | Microsoft_Onnxruntime |
| Source Repository | https://github.com/microsoft/onnxruntime |
| Type | API Doc |
| Language | Python |
| Domain | ML_Inference, Model_Optimization |
| Last Updated | 2026-02-10 |
| Workflow | Python_Inference_Pipeline |
| Pair | 2 of 6 |
Overview
API documentation for the onnxruntime.InferenceSession() constructor, which loads an ONNX model and prepares it for inference with specified execution providers.
API Signature
```python
onnxruntime.InferenceSession(model_path, sess_options=None, providers=None)
```
Import
```python
from onnxruntime import InferenceSession
```
Code Reference
| Reference | Location |
|---|---|
| Import definition | onnxruntime/__init__.py:L82 (imported from capi.onnxruntime_inference_collection) |
| Usage example | docs/python/examples/plot_load_and_predict.py:L25 |
I/O Contract
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| model_path | str or bytes | Yes | Path to an ONNX model file on disk, or a serialized ONNX model as bytes. |
| sess_options | SessionOptions | No | A configured SessionOptions object controlling optimization, profiling, and threading behavior. |
| providers | list[str] | No | Ordered list of execution provider names. Providers are tried in order for each operator. |
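The ordered-fallback behavior of the providers list can be illustrated with a small pure-Python sketch. The SUPPORTED_OPS table and assign_provider function below are hypothetical stand-ins for illustration only; they do not mirror ONNX Runtime internals:

```python
# Hypothetical per-provider operator support table (illustrative only).
SUPPORTED_OPS = {
    "CUDAExecutionProvider": {"Conv", "MatMul", "Relu"},
    "CPUExecutionProvider": {"Conv", "MatMul", "Relu", "TopK"},
}

def assign_provider(op_type, providers):
    """Return the first provider in the ordered list that supports op_type."""
    for name in providers:
        if op_type in SUPPORTED_OPS.get(name, set()):
            return name
    raise RuntimeError(f"no provider supports {op_type}")

# TopK is absent from the (hypothetical) CUDA set, so it falls back to CPU.
plan = {
    op: assign_provider(op, ["CUDAExecutionProvider", "CPUExecutionProvider"])
    for op in ["MatMul", "TopK"]
}
```

This is why CPUExecutionProvider is conventionally listed last: it acts as the catch-all for operators the preferred providers cannot run.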
Outputs
| Output | Type | Description |
|---|---|---|
| return value | InferenceSession | A fully initialized session ready for inference via the .run() method. |
Usage Example
```python
import onnxruntime as rt

# Basic usage: load model with all available providers
sess = rt.InferenceSession("model.onnx", providers=rt.get_available_providers())

# With explicit options and provider selection
options = rt.SessionOptions()
sess = rt.InferenceSession(
    "model.onnx",
    options,
    providers=['CUDAExecutionProvider', 'CPUExecutionProvider']
)
```
From the source example at docs/python/examples/plot_load_and_predict.py:L25:
```python
example1 = get_example("sigmoid.onnx")
sess = rt.InferenceSession(example1, providers=rt.get_available_providers())
```
From the profiling example at docs/python/examples/plot_profiling.py:L38:
```python
sess = rt.InferenceSession(onnx_model_str, providers=rt.get_available_providers())
```