Environment:Huggingface Peft Python Core Dependencies

From Leeroopedia


Knowledge Sources
Domains: Infrastructure, Deep_Learning
Last Updated: 2026-02-07 06:44 GMT

Overview

Python 3.10+ environment with PyTorch >= 1.13.0, Transformers, Accelerate >= 0.21.0, and HuggingFace Hub >= 0.25.0. This is the minimum runtime required to use any PEFT adapter method.

Description

This environment defines the core dependency stack required by the PEFT library. All PEFT adapter methods (LoRA, AdaLoRA, Prompt Tuning, Prefix Tuning, P-Tuning, IA3, BOFT, OFT, etc.) require this base environment. The library targets Python 3.10 through 3.13 and depends on PyTorch for tensor operations, Transformers for model loading, Accelerate for distributed training utilities, and SafeTensors for secure weight serialization.

Usage

This environment is the mandatory prerequisite for every PEFT workflow. Any code that imports from the `peft` package requires these dependencies to be installed. It covers all six documented workflows: LoRA Causal LM Finetuning, QLoRA SFT Finetuning, Seq2Seq AdaLoRA Finetuning, DreamBooth LoRA Diffusion, Prompt Tuning Classification, and LoRA Embedding Semantic Search.
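
Before running any of these workflows, it can help to confirm that the core imports actually resolve. The sketch below is not part of PEFT; it is a hypothetical checker using only the stdlib `importlib.util.find_spec` to report which core packages are missing (note the distribution `pyyaml` is imported as `yaml`):

```python
from importlib.util import find_spec

# Import names for the core PEFT dependency stack.
CORE_MODULES = ["torch", "transformers", "accelerate", "numpy",
                "packaging", "psutil", "yaml", "tqdm",
                "safetensors", "huggingface_hub", "peft"]

def missing_modules(modules):
    """Return the subset of `modules` that cannot be imported."""
    return [m for m in modules if find_spec(m) is None]

if __name__ == "__main__":
    missing = missing_modules(CORE_MODULES)
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All core PEFT dependencies are importable.")
```

This checks only that a package is present, not that its version meets the floors listed under Dependencies below.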

System Requirements

Category | Requirement     | Notes
OS       | Linux / macOS / Windows | OS Independent per package classifiers
Python   | >= 3.10.0       | Tested on 3.10, 3.11, 3.12, 3.13
Hardware | CPU minimum     | GPU recommended for training (see GPU Hardware Detection environment)
Disk     | ~500 MB         | For core packages; models require additional space

Dependencies

System Packages

  • `python` >= 3.10.0

Python Packages (Core)

  • `torch` >= 1.13.0
  • `transformers` (any version; specific features require >= 4.33.0)
  • `accelerate` >= 0.21.0
  • `numpy` >= 1.17
  • `packaging` >= 20.0
  • `psutil`
  • `pyyaml`
  • `tqdm`
  • `safetensors`
  • `huggingface_hub` >= 0.25.0
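
A quick way to audit installed versions against these floors is the stdlib `importlib.metadata` plus a naive numeric-tuple comparison. This is a sketch: real resolution should use `packaging.version` (itself a core dependency), and the parser here assumes mostly numeric version strings:

```python
from importlib.metadata import version, PackageNotFoundError

# (distribution name, minimum version) pairs from install_requires;
# None means any version is accepted.
FLOORS = [("torch", "1.13.0"), ("transformers", None),
          ("accelerate", "0.21.0"), ("numpy", "1.17"),
          ("huggingface_hub", "0.25.0"), ("safetensors", None)]

def as_tuple(v):
    """Parse a dotted version into a tuple of ints, ignoring local
    segments and non-numeric suffixes (e.g. '2.1.0+cu118' -> (2, 1, 0))."""
    parts = []
    for piece in v.split("+")[0].split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def satisfies(installed, minimum):
    return minimum is None or as_tuple(installed) >= as_tuple(minimum)

def audit(floors):
    """Return a list of human-readable problems; empty means all good."""
    problems = []
    for name, minimum in floors:
        try:
            have = version(name)
        except PackageNotFoundError:
            problems.append(f"{name}: not installed")
            continue
        if not satisfies(have, minimum):
            problems.append(f"{name}: {have} < required {minimum}")
    return problems
```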

Python Packages (Test/Dev Extras)

  • `pytest`, `pytest-cov`, `pytest-xdist`
  • `datasets`, `diffusers`, `scipy`
  • `parameterized`, `protobuf`, `sentencepiece`

Credentials

The following environment variables may be needed depending on workflow:

  • `HF_TOKEN`: HuggingFace API token for downloading gated models or pushing to Hub.
  • `HF_HUB_OFFLINE`: Set to `"1"` to disable Hub downloads (offline mode).
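
In code these are plain process environment variables. A small sketch (helper names are illustrative, not a huggingface_hub API) of reading the token and detecting offline mode; note that huggingface_hub also accepts other truthy spellings for `HF_HUB_OFFLINE`, while this sketch checks only the `"1"` documented above:

```python
import os

def hf_token():
    """Return the HuggingFace token, or None if not configured."""
    return os.environ.get("HF_TOKEN")

def hub_offline():
    """True when HF_HUB_OFFLINE requests offline mode."""
    return os.environ.get("HF_HUB_OFFLINE", "0") == "1"

# Example: force offline mode for an air-gapped run.
os.environ["HF_HUB_OFFLINE"] = "1"
```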

Quick Install

# Install core PEFT with all required dependencies
pip install peft

# Or install the core dependencies individually (quote version
# specifiers so the shell does not interpret ">=" as a redirect)
pip install "torch>=1.13.0" transformers "accelerate>=0.21.0" "numpy>=1.17" "packaging>=20.0" psutil pyyaml tqdm safetensors "huggingface_hub>=0.25.0"

# With test extras
pip install "peft[test]"

Code Evidence

Python version requirement from `setup.py:60`:

python_requires=">=3.10.0",

Core dependency list from `setup.py:61-72`:

install_requires=[
    "numpy>=1.17",
    "packaging>=20.0",
    "psutil",
    "pyyaml",
    "torch>=1.13.0",
    "transformers",
    "tqdm",
    "accelerate>=0.21.0",
    "safetensors",
    "huggingface_hub>=0.25.0",
]

Accelerate version check from `src/peft/utils/other.py:75`:

if version.parse(accelerate.__version__) >= version.parse("0.29.0"):
    from accelerate.utils import is_mlu_available
    mlu_available = is_mlu_available()

Transformers version check from `src/peft/tuners/lora/model.py:365-369`:

transformers_lt_4_52 = packaging.version.parse(
    transformers.__version__
) < packaging.version.parse("4.52.1")
if transformers_lt_4_52:
    raise ValueError("Using aLoRA requires transformers >= 4.52.1.")

DTensor support check from `src/peft/tuners/tuners_utils.py:61-62`:

_torch_supports_dtensor = version.parse(torch.__version__) >= version.parse("2.5.0")
_torch_supports_distributed = _torch_supports_dtensor and torch.distributed.is_available()

Common Errors

Error Message | Cause | Solution
`python_requires >= 3.10.0` | Python version too old | Upgrade to Python 3.10+
`Using aLoRA requires transformers >= 4.52.1` | Transformers too old for the aLoRA feature | `pip install "transformers>=4.52.1"`
`gradient_checkpointing_kwargs is not supported` | Transformers < 4.34.1 | Upgrade transformers for gradient-checkpointing kwargs support
`ImportError: accelerate` | Accelerate not installed | `pip install "accelerate>=0.21.0"`
`Your transformers version is too old, please upgrade it to >= 4.53.1` | Features require a newer transformers | `pip install "transformers>=4.53.1"`

Compatibility Notes

  • Transformers versions: Different PEFT features require different minimum transformers versions:
    • >= 4.33.0 for DeepSpeed integration and AdaLoRA layer support
    • >= 4.34.1 for `gradient_checkpointing_kwargs`
    • >= 4.36.0 for quantized model generation
    • >= 4.38.0 for updated generation API
    • >= 4.52.1 for aLoRA (Activated LoRA)
    • >= 4.53.1 for latest model support
  • PyTorch >= 2.5.0: Required for DTensor / distributed tensor support
  • Accelerate >= 0.29.0: Required for MLU (Cambricon) hardware detection
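
The transformers floors above can be encoded as a lookup table. The following is a hypothetical feature gate (not a PEFT API) in the same spirit as the library's own `version.parse` checks; it assumes plain numeric `x.y.z` version strings:

```python
# Minimum transformers version per feature, from the list above.
FEATURE_FLOORS = {
    "deepspeed": (4, 33, 0),
    "gradient_checkpointing_kwargs": (4, 34, 1),
    "quantized_generation": (4, 36, 0),
    "generation_api": (4, 38, 0),
    "alora": (4, 52, 1),
    "latest_models": (4, 53, 1),
}

def feature_supported(feature, installed_version):
    """Compare an installed transformers version string against its floor."""
    floor = FEATURE_FLOORS[feature]
    installed = tuple(int(p) for p in installed_version.split(".")[:3])
    return installed >= floor
```

For example, transformers 4.40.0 passes the updated-generation-API floor but not the aLoRA one.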
