
Implementation:Huggingface Optimum Is Backend Available

From Leeroopedia

Overview

Implementation of backend availability detection functions that check whether hardware acceleration backends (ONNX Runtime, OpenVINO, IPEX) are installed in the current Python environment.

Source

File: optimum/utils/import_utils.py

Repository: optimum

APIs

Primary Availability Functions

is_onnxruntime_available() -> bool (L182-183): Checks whether any of 17 ONNX Runtime distribution variants is installed. Returns the cached _onnxruntime_available flag.
is_openvino_available() -> bool (L241-242): Checks whether OpenVINO is installed. Returns the cached _openvino_available flag.
is_ipex_available() -> bool (L237-238): Checks whether Intel Extension for PyTorch (IPEX) is installed. Returns the cached _ipex_available flag.

Helper Function

def _is_package_available(
    pkg_name: str,
    return_version: bool = False,
    pkg_distributions: Optional[List[str]] = None,
) -> Union[Tuple[bool, str], bool]:

Lines: L40-79

Description: Core detection function that performs a two-phase check:

  1. Uses importlib.util.find_spec(pkg_name) to check if the package is importable
  2. Iterates over pkg_distributions using importlib.metadata.version() to verify the package is properly installed
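The two phases catch different failure modes. As a hedged illustration (not Optimum's actual code), the standard-library module json passes the import check but has no installed distribution, so the metadata lookup fails:

```python
import importlib.metadata
import importlib.util


def two_phase_check(pkg_name: str) -> bool:
    # Phase 1: is the module importable at all?
    if importlib.util.find_spec(pkg_name) is None:
        return False
    # Phase 2: is a matching distribution actually installed?
    try:
        importlib.metadata.version(pkg_name)
        return True
    except importlib.metadata.PackageNotFoundError:
        return False


# json is importable (phase 1 passes) but ships no distribution
# metadata, so phase 2 fails and the overall check returns False.
print(two_phase_check("json"))                        # → False
print(two_phase_check("definitely_not_installed_x"))  # → False
```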

Module-Level Caching

The availability results are computed once at module load time and stored in module-level variables:

_openvino_available = _is_package_available("openvino")                    # L87
_ipex_available = _is_package_available("intel_extension_for_pytorch")     # L92
_onnxruntime_available = _is_package_available(                            # L97-118
    "onnxruntime",
    pkg_distributions=[
        "onnxruntime-gpu",
        "onnxruntime-rocm",
        "onnxruntime-training",
        "onnxruntime-training-rocm",
        "onnxruntime-training-cpu",
        "onnxruntime-openvino",
        "onnxruntime-vitisai",
        "onnxruntime-armnn",
        "onnxruntime-cann",
        "onnxruntime-dnnl",
        "onnxruntime-acl",
        "onnxruntime-tvm",
        "onnxruntime-qnn",
        "onnxruntime-migraphx",
        "ort-migraphx-nightly",
        "ort-rocm-nightly",
    ],
)

Import

from optimum.utils import is_onnxruntime_available, is_openvino_available, is_ipex_available

I/O

Input: none. Each function takes no arguments and reads a cached module-level boolean flag set at import time.
Output: bool. True if the corresponding backend package is installed and available, False otherwise.

Usage Example

from optimum.utils import is_onnxruntime_available, is_openvino_available, is_ipex_available

# Check individual backends
print(f"ONNX Runtime available: {is_onnxruntime_available()}")
print(f"OpenVINO available: {is_openvino_available()}")
print(f"IPEX available: {is_ipex_available()}")

# Conditional backend selection
if is_openvino_available():
    from optimum.intel import OVModelForSequenceClassification
elif is_onnxruntime_available():
    from optimum.onnxruntime import ORTModelForSequenceClassification

Internal Details

The _is_package_available helper function at L40-79 follows this logic:

  1. Call importlib.util.find_spec(pkg_name) to check if the module is importable
  2. If pkg_distributions is provided, append pkg_name itself to the list
  3. Iterate through all distribution names, calling importlib.metadata.version(pkg) on each
  4. Return True on the first successful version retrieval; False if none succeed
  5. If return_version=True, return a tuple of (bool, str) with the version string
