
Environment: Kornia ONNX Runtime Environment

From Leeroopedia


Knowledge Sources
Domains Infrastructure, Model_Deployment
Last Updated 2026-02-09 15:00 GMT

Overview

Optional environment extending the core Kornia stack with ONNX and ONNX Runtime for model export, loading, and inference.

Description

This environment provides the additional dependencies required for Kornia's ONNX workflow. The onnx package is needed for model graph manipulation and export, while onnxruntime provides the inference engine. These are optional dependencies managed through Kornia's LazyLoader system, which means they are only imported when ONNX functionality is actually used. The LazyLoader can automatically install missing dependencies or prompt the user, depending on configuration.
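To make the lazy-loading idea concrete, here is a minimal sketch of the deferred-import pattern. This is a hypothetical illustration, not Kornia's actual `LazyLoader` class: the real implementation also handles the auto-install and prompt modes described below.

```python
import importlib


class LazyLoaderSketch:
    """Defer importing a module until one of its attributes is first accessed.

    Hypothetical illustration of the lazy-import pattern; not Kornia's
    actual LazyLoader implementation.
    """

    def __init__(self, module_name: str):
        self.module_name = module_name
        self._module = None  # not imported yet

    def __getattr__(self, attr):
        # Called only for attributes missing on the instance itself,
        # i.e. anything that should come from the wrapped module.
        if self._module is None:
            try:
                self._module = importlib.import_module(self.module_name)
            except ImportError as exc:
                raise ImportError(
                    f"Optional dependency '{self.module_name}' is not installed. "
                    f"Install it with `pip install {self.module_name}`."
                ) from exc
        return getattr(self._module, attr)


# Constructing the loader imports nothing; the stdlib `json` module is
# only imported when `.dumps` is first accessed.
lazy_json = LazyLoaderSketch("json")
print(lazy_json.dumps([1, 2]))  # [1, 2]
```

The same attribute access raises an `ImportError` with an install hint when the wrapped package is missing, which is the behavior the error messages in the Common Errors table below come from.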

Usage

Use this environment when working with the ONNX Model Pipeline workflow, including loading ONNX models from files or HuggingFace, building sequential ONNX pipelines, running ONNX inference, or exporting combined ONNX graphs. This is required for all `kornia.onnx` module functionality.

System Requirements

| Category | Requirement | Notes |
|---|---|---|
| OS | Linux, macOS, Windows | Cross-platform ONNX Runtime support |
| Hardware | CPU (minimum) | GPU optional; use `onnxruntime-gpu` for CUDA acceleration |
| Disk | ~200 MB additional | For the `onnx` and `onnxruntime` packages |

Dependencies

Python Packages

  • `onnx` (listed in dev dependencies)
  • `onnxruntime` (listed in dev dependencies)
  • `onnxscript` (listed in dev dependencies, for advanced export)
  • All core Kornia dependencies (see PyTorch_Python_Environment)

Credentials

No credentials required for local ONNX operations. For loading models from HuggingFace Hub, `HF_TOKEN` may be needed for private models.

Quick Install

# Install ONNX dependencies
pip install onnx onnxruntime

# For GPU inference
pip install onnx onnxruntime-gpu

# Or install Kornia with all dev dependencies
pip install "kornia[dev]"
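After installing, you can verify that the optional packages are visible to Python without importing them. This check uses only the standard library, so it works even when the packages are absent:

```python
import importlib.util

# find_spec locates a package without importing it, so this is safe to run
# whether or not the ONNX dependencies are installed.
for pkg in ("onnx", "onnxruntime"):
    status = "installed" if importlib.util.find_spec(pkg) else "missing"
    print(f"{pkg}: {status}")
```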

Code Evidence

Lazy loading of ONNX dependencies from `kornia/core/external.py:154-157`:

onnx = LazyLoader("onnx", dev_dependency=True)
onnxruntime = LazyLoader("onnxruntime")

LazyLoader auto-install behavior from `kornia/core/external.py:83-84`:

if kornia_config.lazyloader.installation_mode == InstallationMode.AUTO or self.auto_install:
    self._install_package(self.module_name)

ONNX Runtime provider selection based on device from `kornia/feature/lightglue_onnx/lightglue.py:70-76`:

if device.type == "cpu":
    providers = ["CPUExecutionProvider"]
elif device.type == "cuda":
    providers = ["CUDAExecutionProvider", "CPUExecutionProvider"]

Dev dependencies in `pyproject.toml:43-61`:

[project.optional-dependencies]
dev = [
  "onnx",
  "onnxruntime",
  "onnxscript",
  ...
]

Common Errors

| Error Message | Cause | Solution |
|---|---|---|
| `ImportError: Optional dependency 'onnx' is not installed` | `onnx` package missing | `pip install onnx` |
| `ImportError: Optional dependency 'onnxruntime' is not installed` | `onnxruntime` package missing | `pip install onnxruntime` |
| `KORNIA_CHECK failed: FP16 requires CUDA` | Attempting FP16 inference on CPU | Use a CUDA device or switch to FP32 precision |
| `RuntimeError: CUDAExecutionProvider not available` | `onnxruntime-gpu` not installed | Install `onnxruntime-gpu` instead of `onnxruntime` |

Compatibility Notes

  • LazyLoader Modes: Set `kornia_config.lazyloader.installation_mode` to `"AUTO"` for automatic installation, `"ASK"` (default) for interactive prompts, or `"RAISE"` for strict mode that errors on missing deps.
  • onnxscript: A `DeprecationWarning` from `onnxscript.converter` is suppressed in pytest configuration.
  • GPU Inference: Requires `onnxruntime-gpu` package (not just `onnxruntime`) and CUDA drivers.
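Based on the identifiers quoted in the Code Evidence section, switching the LazyLoader mode looks roughly like the fragment below; the exact import path is an assumption and may differ across Kornia versions:

```python
# Sketch based on the config names quoted above; exact import paths
# may differ between Kornia versions.
from kornia.config import InstallationMode, kornia_config

# AUTO installs missing optional deps silently; ASK (the default) prompts;
# RAISE errors immediately when a dependency is missing.
kornia_config.lazyloader.installation_mode = InstallationMode.AUTO
```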
