
Implementation:Huggingface Optimum TasksManager Determine Framework

From Leeroopedia
Page Type: Implementation
Source Repository: https://github.com/huggingface/optimum
Source File: optimum/exporters/tasks.py
Domains: NLP, Computer_Vision, Export
Last Updated: 2026-02-15 00:00 GMT

Overview

This implementation provides the TasksManager.determine_framework static method which inspects model weight files to detect whether a model uses PyTorch or TensorFlow.

API Reference

TasksManager.determine_framework

Source location: optimum/exporters/tasks.py, lines 580-659

Purpose: Determines the deep learning framework ("pt" or "tf") for a given model by inspecting available weight files in the repository or local directory.

Signature:

@staticmethod
def determine_framework(
    model_name_or_path: Union[str, Path],
    subfolder: str = "",
    revision: Optional[str] = None,
    cache_dir: str = HUGGINGFACE_HUB_CACHE,
    token: Optional[Union[bool, str]] = None,
) -> str:

Parameters:

model_name_or_path (Union[str, Path], required) -- Hub model ID or path to a local directory containing the model.
subfolder (str, default "") -- Subfolder within the model directory on the Hub.
revision (Optional[str], default None) -- Specific model version (branch, tag, or commit ID).
cache_dir (str, default HUGGINGFACE_HUB_CACHE) -- Path to cached pretrained model weights.
token (Optional[Union[bool, str]], default None) -- Authentication token for private Hub repos.

Returns: str -- Either "pt" (PyTorch) or "tf" (TensorFlow).

Raises:

  • ConnectionError -- If the framework could not be inferred and Hub connection failed
  • FileNotFoundError -- If no recognizable weight files are found
  • EnvironmentError -- If no framework-specific weight files are found and PyTorch is not installed in the environment

Detection Algorithm

The detection proceeds through the following steps:

  1. Retrieve file listing -- Calls TasksManager.get_model_files to get all files in the model repository. This first tries the Hub API, then falls back to a cached local snapshot.
  2. Check PyTorch weight patterns -- Constructs file matching patterns from WEIGHTS_NAME (pytorch_model.bin) and SAFE_WEIGHTS_NAME (model.safetensors). Files are checked against both the stem and extension of these standard names, which also matches sharded weights such as pytorch_model-00001-of-00002.bin:
pt_weight_name = Path(WEIGHTS_NAME).stem          # "pytorch_model"
pt_weight_extension = Path(WEIGHTS_NAME).suffix    # ".bin"
safe_weight_name = Path(SAFE_WEIGHTS_NAME).stem    # "model"
safe_weight_extension = Path(SAFE_WEIGHTS_NAME).suffix  # ".safetensors"

is_pt_weight_file = [
    (file.startswith(pt_weight_name) and file.endswith(pt_weight_extension))
    or (file.startswith(safe_weight_name) and file.endswith(safe_weight_extension))
    for file in all_files
]
  3. Check diffusers pattern -- If model_index.json is present and any file ends with .bin or .safetensors, the framework is "pt".
  4. Check Sentence Transformers -- If config_sentence_transformers.json is found, the framework is "pt" (Sentence Transformers relies on PyTorch).
  5. Fallback -- If no framework-specific files are found, the method falls back to the local environment: if PyTorch is installed, it returns "pt"; otherwise it raises EnvironmentError.
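The file-inspection steps above can be condensed into a small standalone sketch. The TensorFlow pattern (tf_model.h5) and the helper's name and signature are assumptions for illustration only; the real method additionally handles Hub retrieval, caching, authentication, and the environment fallback:

```python
from pathlib import Path
from typing import List, Optional

# Standard weight file names (mirroring transformers' constants; the
# TensorFlow name is an assumption not spelled out in the steps above).
WEIGHTS_NAME = "pytorch_model.bin"
SAFE_WEIGHTS_NAME = "model.safetensors"
TF_WEIGHTS_NAME = "tf_model.h5"


def detect_framework(all_files: List[str]) -> Optional[str]:
    """Simplified sketch of the detection order described above."""
    pt_stem, pt_ext = Path(WEIGHTS_NAME).stem, Path(WEIGHTS_NAME).suffix
    safe_stem, safe_ext = Path(SAFE_WEIGHTS_NAME).stem, Path(SAFE_WEIGHTS_NAME).suffix

    # Step 2: PyTorch weight patterns; stem/extension matching also
    # catches sharded files such as "pytorch_model-00001-of-00002.bin".
    if any(
        (f.startswith(pt_stem) and f.endswith(pt_ext))
        or (f.startswith(safe_stem) and f.endswith(safe_ext))
        for f in all_files
    ):
        return "pt"

    # Step 3: diffusers layout (model_index.json plus any binary weights).
    if "model_index.json" in all_files and any(
        f.endswith((".bin", ".safetensors")) for f in all_files
    ):
        return "pt"

    # Step 4: Sentence Transformers layout implies PyTorch.
    if "config_sentence_transformers.json" in all_files:
        return "pt"

    # TensorFlow weight pattern (assumed, by analogy with the PT check).
    tf_stem, tf_ext = Path(TF_WEIGHTS_NAME).stem, Path(TF_WEIGHTS_NAME).suffix
    if any(f.startswith(tf_stem) and f.endswith(tf_ext) for f in all_files):
        return "tf"

    # Step 5: nothing matched; the real method falls back to whichever
    # framework is installed, or raises.
    return None
```

Running this on a typical PyTorch checkpoint listing such as ["config.json", "pytorch_model.bin"] yields "pt", while ["config.json", "tf_model.h5"] yields "tf".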

Import

from optimum.exporters import TasksManager

Input/Output Summary

Input: Model name or path (string or Path)
Output: Framework string, "pt" or "tf"

Usage Example

from optimum.exporters import TasksManager

# Detect framework for a Hub model
framework = TasksManager.determine_framework("bert-base-uncased")
# Returns: "pt"

# Detect framework for a local model
framework = TasksManager.determine_framework("/path/to/local/model")
# Returns: "pt" or "tf" based on weight files present
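The returned string is typically used to dispatch model loading. A minimal sketch of such a dispatch (the mapping below follows transformers' AutoModel / TFAutoModel naming convention and is illustrative, not part of Optimum's API):

```python
def auto_model_class_name(framework: str) -> str:
    """Map a determine_framework() result ("pt" or "tf") to the
    corresponding transformers auto-class name (illustrative only)."""
    if framework == "pt":
        return "AutoModel"
    if framework == "tf":
        return "TFAutoModel"
    raise ValueError(f"Unknown framework: {framework!r}")
```

Any value other than the two documented return strings is rejected, mirroring the fact that determine_framework only ever returns "pt" or "tf".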
