Implementation:Huggingface Diffusers PeftAdapterMixin Add Adapter

From Leeroopedia
Domains: Diffusion_Models, Parameter_Efficient_Finetuning, LoRA
Last Updated: 2026-02-13 21:00 GMT

Overview

Concrete tool for injecting PEFT adapter layers (such as LoRA) into Diffusers model components, provided by the PeftAdapterMixin.add_adapter method.

Description

The add_adapter method on PeftAdapterMixin (which UNet2DConditionModel and other Diffusers models inherit) takes a PEFT configuration object and injects the corresponding adapter layers into the model. For LoRA, this means wrapping the specified target linear layers with parallel low-rank matrices. The method delegates to PEFT's inject_adapter_in_model function, which traverses the model's module tree, identifies layers matching the target_modules pattern, and replaces them with LoRA-augmented versions.
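As a mental model, the transformation applied to each matched linear layer can be sketched as below. This is a simplified illustration of the LoRA computation, not PEFT's actual implementation (which also handles dropout, multiple adapters, and weight merging); the class name is hypothetical.

import torch
import torch.nn as nn

class LoRALinearSketch(nn.Module):
    """Illustrative stand-in for a LoRA-wrapped linear layer (hypothetical)."""

    def __init__(self, base: nn.Linear, r: int, lora_alpha: int):
        super().__init__()
        self.base = base  # original frozen projection, e.g. to_q
        self.lora_A = nn.Linear(base.in_features, r, bias=False)
        self.lora_B = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)  # B = 0, so the adapter starts as a no-op
        self.scaling = lora_alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus scaled low-rank update: W x + (alpha / r) * B A x
        return self.base(x) + self.scaling * self.lora_B(self.lora_A(x))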

After injection, add_adapter activates the new adapter internally via set_adapter. The original weights remain frozen (assuming a prior requires_grad_(False) call), and only the newly injected LoRA parameters are trainable. When training with fp16 mixed precision, the LoRA parameters must be explicitly cast back to float32 via cast_training_params to maintain numerical stability during optimizer updates.
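A quick way to verify this state after injection is to list the parameters that still require gradients. The substring check below assumes PEFT's usual lora_A/lora_B parameter naming, which may vary across versions:

# After unet.requires_grad_(False) and unet.add_adapter(...), only the
# injected adapter parameters should still be trainable.
trainable = [name for name, p in unet.named_parameters() if p.requires_grad]
print(f"{len(trainable)} trainable tensors")
assert all("lora" in name for name in trainable)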

Usage

Use add_adapter when:

  • Setting up LoRA fine-tuning for diffusion models
  • Adding parameter-efficient adapters to UNet or text encoder components
  • Managing multiple named adapters on the same model (see the sketch after this list)
  • Integrating with the PEFT library's adapter configuration system
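For the multi-adapter case, the pattern looks roughly like the sketch below; set_adapter is the PeftAdapterMixin method for choosing which named adapter is active, and the adapter names and ranks here are illustrative:

from peft import LoraConfig

# Two independently configured adapters on the same UNet
unet.add_adapter(
    LoraConfig(r=4, lora_alpha=4, target_modules=["to_k", "to_q", "to_v", "to_out.0"]),
    adapter_name="style",
)
unet.add_adapter(
    LoraConfig(r=8, lora_alpha=8, target_modules=["to_k", "to_q", "to_v", "to_out.0"]),
    adapter_name="subject",
)

# Make only the "style" adapter active for subsequent forward passes
unet.set_adapter("style")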

Code Reference

Source Location

  • Repository: diffusers
  • File: src/diffusers/loaders/peft.py
  • Lines: 526-562

Signature

def add_adapter(
    self,
    adapter_config,
    adapter_name: str = "default",
) -> None:

Import

from peft import LoraConfig
from diffusers import UNet2DConditionModel

# LoRA config is passed to add_adapter on the model instance
unet.add_adapter(unet_lora_config)

I/O Contract

Inputs

Name | Type | Required | Description
adapter_config | PeftConfig (e.g., LoraConfig) | Yes | Configuration object specifying the adapter type and hyperparameters. For LoRA, this includes r, lora_alpha, target_modules, and init_lora_weights.
adapter_name | str | No | Name identifier for the adapter. Defaults to "default". Must be unique if multiple adapters are added.

LoRA Configuration Parameters

Name | Type | Required | Description
r | int | No | Rank of the low-rank decomposition; defaults to 8 in PEFT. Common values: 4, 8, 16, 32, 64. Lower rank means fewer trainable parameters.
lora_alpha | int | No | Scaling factor for the LoRA output; defaults to 8 in PEFT. The effective scale is lora_alpha / r. Often set equal to r.
init_lora_weights | str or bool | No | Weight initialization strategy. "gaussian" uses Gaussian initialization for matrix A and zeros for matrix B. True uses default PEFT initialization.
target_modules | list[str] | No | List of module name patterns to apply LoRA to. For UNet attention layers: ["to_k", "to_q", "to_v", "to_out.0"].
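Since r controls the decomposition rank, each targeted linear layer of shape (d_out, d_in) contributes r * (d_in + d_out) trainable weights. A quick back-of-the-envelope check (dimensions are illustrative):

def lora_param_count(d_in: int, d_out: int, r: int) -> int:
    # A has shape (r, d_in); B has shape (d_out, r); no biases are added
    return r * (d_in + d_out)

# A 320-dim attention projection at rank 4:
print(lora_param_count(320, 320, 4))  # 2560 trainable vs. 102400 frozen weights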

Outputs

Name | Type | Description
(modifies model in-place) | None | The model is modified in-place with LoRA layers injected into the target modules. The adapter is automatically activated.

Usage Examples

Basic Usage

import torch

from diffusers import UNet2DConditionModel
from peft import LoraConfig

# Load and freeze the UNet
unet = UNet2DConditionModel.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    subfolder="unet",
)
unet.requires_grad_(False)

# Configure LoRA
unet_lora_config = LoraConfig(
    r=4,
    lora_alpha=4,
    init_lora_weights="gaussian",
    target_modules=["to_k", "to_q", "to_v", "to_out.0"],
)

# Inject LoRA adapters into the UNet
unet.add_adapter(unet_lora_config)

# For fp16 mixed precision, cast LoRA params back to float32
from diffusers.training_utils import cast_training_params
cast_training_params(unet, dtype=torch.float32)

# Collect trainable parameters for the optimizer
lora_layers = filter(lambda p: p.requires_grad, unet.parameters())
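A typical next step is to hand exactly these parameters to the optimizer; the optimizer choice and learning rate below are illustrative, not prescribed by Diffusers:

import torch

# Only the LoRA parameters are updated; the frozen base weights are untouched
optimizer = torch.optim.AdamW(lora_layers, lr=1e-4)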

