Implementation:Huggingface Optimum TasksManager Determine Framework
| Field | Value |
|---|---|
| Page Type | Implementation |
| Source Repository | https://github.com/huggingface/optimum |
| Source File | optimum/exporters/tasks.py |
| Domains | NLP, Computer_Vision, Export |
| Last Updated | 2026-02-15 00:00 GMT |
Overview
This implementation provides the TasksManager.determine_framework static method which inspects model weight files to detect whether a model uses PyTorch or TensorFlow.
API Reference
TasksManager.determine_framework
Source location: optimum/exporters/tasks.py, lines 580-659
Purpose: Determines the deep learning framework ("pt" or "tf") for a given model by inspecting available weight files in the repository or local directory.
Signature:
```python
@staticmethod
def determine_framework(
    model_name_or_path: Union[str, Path],
    subfolder: str = "",
    revision: Optional[str] = None,
    cache_dir: str = HUGGINGFACE_HUB_CACHE,
    token: Optional[Union[bool, str]] = None,
) -> str:
```
Parameters:
| Parameter | Type | Default | Description |
|---|---|---|---|
| model_name_or_path | Union[str, Path] | required | Hub model ID or path to a local directory containing the model. |
| subfolder | str | "" | Subfolder within the model directory on the Hub. |
| revision | Optional[str] | None | Specific model version (branch, tag, or commit ID). |
| cache_dir | str | HUGGINGFACE_HUB_CACHE | Path to cached pretrained model weights. |
| token | Optional[Union[bool, str]] | None | Authentication token for private Hub repos. |
Returns: str -- Either "pt" (PyTorch) or "tf" (TensorFlow).
Raises:
- ConnectionError -- If the framework could not be inferred and the Hub connection failed
- FileNotFoundError -- If no recognizable weight files are found
- EnvironmentError -- If PyTorch is not installed in the environment
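The fallback that ends in EnvironmentError can be sketched as a small helper. This is a hypothetical illustration, not part of the optimum API: fallback_framework and its is_available probe are names invented here, and the TensorFlow branch is an assumption about the fallback order.

```python
import importlib.util


def fallback_framework(is_available=None):
    """Hypothetical sketch: pick the installed framework when no weight
    files reveal one. Not part of the optimum API."""
    if is_available is None:
        # Probe the current environment for an importable module.
        is_available = lambda mod: importlib.util.find_spec(mod) is not None
    if is_available("torch"):
        return "pt"
    if is_available("tensorflow"):
        return "tf"
    raise EnvironmentError("Neither PyTorch nor TensorFlow is installed.")
```

Injecting the probe keeps the helper testable without requiring either framework to be installed.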
Detection Algorithm
The detection proceeds through the following steps:
- Retrieve file listing -- Calls TasksManager.get_model_files to get all files in the model repository. This first tries the Hub API, then falls back to a cached local snapshot.
- Check PyTorch weight patterns -- Constructs file-matching patterns from WEIGHTS_NAME (pytorch_model.bin) and SAFE_WEIGHTS_NAME (model.safetensors). Files are checked against both the stem and the extension of these standard names:
```python
pt_weight_name = Path(WEIGHTS_NAME).stem                # "pytorch_model"
pt_weight_extension = Path(WEIGHTS_NAME).suffix         # ".bin"
safe_weight_name = Path(SAFE_WEIGHTS_NAME).stem         # "model"
safe_weight_extension = Path(SAFE_WEIGHTS_NAME).suffix  # ".safetensors"

is_pt_weight_file = [
    (file.startswith(pt_weight_name) and file.endswith(pt_weight_extension))
    or (file.startswith(safe_weight_name) and file.endswith(safe_weight_extension))
    for file in all_files
]
```
- Check diffusers pattern -- If model_index.json is present and any file ends with .bin or .safetensors, the framework is "pt".
- Check Sentence Transformers -- If config_sentence_transformers.json is found, the framework is "pt" (Sentence Transformers relies on PyTorch).
- Fallback -- If no framework-specific files are found, falls back to the installed framework: if PyTorch is available in the environment, returns "pt". If PyTorch is not installed, raises EnvironmentError.
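The file-name checks in steps 2-4 can be condensed into a standalone sketch. This is a simplified illustration, not the actual implementation: guess_framework is a name invented here, and the TensorFlow weight name tf_model.h5 is an assumption about how "tf" is detected.

```python
from pathlib import Path

# Standard Hugging Face weight-file names referenced in the steps above.
PT_WEIGHTS = "pytorch_model.bin"
SAFE_WEIGHTS = "model.safetensors"
TF_WEIGHTS = "tf_model.h5"  # assumption: standard TensorFlow weight name


def guess_framework(file_names):
    """Return 'pt' or 'tf' from a list of repo file names, or None.

    Hypothetical, simplified version of the detection heuristic.
    """
    pt_stem, pt_ext = Path(PT_WEIGHTS).stem, Path(PT_WEIGHTS).suffix
    safe_stem, safe_ext = Path(SAFE_WEIGHTS).stem, Path(SAFE_WEIGHTS).suffix
    tf_stem, tf_ext = Path(TF_WEIGHTS).stem, Path(TF_WEIGHTS).suffix

    for name in file_names:
        # Step 2: PyTorch weight patterns (stem + extension of standard names).
        if (name.startswith(pt_stem) and name.endswith(pt_ext)) or (
            name.startswith(safe_stem) and name.endswith(safe_ext)
        ):
            return "pt"
        if name.startswith(tf_stem) and name.endswith(tf_ext):
            return "tf"
    # Step 3: diffusers-style repo (model_index.json plus torch-style weights).
    if "model_index.json" in file_names and any(
        n.endswith((".bin", ".safetensors")) for n in file_names
    ):
        return "pt"
    # Step 4: Sentence Transformers repos always use PyTorch.
    if "config_sentence_transformers.json" in file_names:
        return "pt"
    return None
```

A caller would feed this the file listing from step 1 and apply the step 5 fallback when it returns None.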
Import
```python
from optimum.exporters import TasksManager
```
Input/Output Summary
| Input | Output |
|---|---|
| Model name or path (string or Path) | Framework string: "pt" or "tf" |
Usage Example
```python
from optimum.exporters import TasksManager

# Detect framework for a Hub model
framework = TasksManager.determine_framework("bert-base-uncased")
# Returns: "pt"

# Detect framework for a local model
framework = TasksManager.determine_framework("/path/to/local/model")
# Returns: "pt" or "tf" based on weight files present
```