
Implementation:OpenGVLab InternVL Wrap Backbone LoRA

From Leeroopedia


Knowledge Sources
Domains Parameter_Efficient_Finetuning, Computer_Vision
Last Updated 2026-02-07 00:00 GMT

Overview

A concrete tool for injecting LoRA adapters into the InternViT vision encoder, exposed as a method on the InternVLChatModel class.

Description

The wrap_backbone_lora method on InternVLChatModel wraps the vision_model submodule with PEFT LoRA adapters targeting attention and MLP layers.

Usage

Called during training initialization when use_backbone_lora > 0 in ModelArguments. Typically used alongside wrap_llm_lora for full adapter fine-tuning.
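The wiring described above can be sketched as follows. This is a minimal illustration of how training initialization might dispatch on the ModelArguments fields; the `ModelArguments` dataclass here is a stripped-down stand-in and `apply_lora_adapters` is a hypothetical helper name, not part of the InternVL codebase. Only the `use_backbone_lora` / `use_llm_lora` field names and the `wrap_backbone_lora` / `wrap_llm_lora` methods come from the source.

```python
from dataclasses import dataclass

@dataclass
class ModelArguments:
    # In InternVL these fields hold the LoRA rank; 0 disables the adapter.
    use_backbone_lora: int = 0
    use_llm_lora: int = 0

def apply_lora_adapters(model, args: ModelArguments):
    # Hypothetical helper: wrap each submodule when its rank is positive,
    # following the lora_alpha = 2 * r convention noted in the signature.
    if args.use_backbone_lora > 0:
        model.wrap_backbone_lora(r=args.use_backbone_lora,
                                 lora_alpha=2 * args.use_backbone_lora)
    if args.use_llm_lora > 0:
        model.wrap_llm_lora(r=args.use_llm_lora,
                            lora_alpha=2 * args.use_llm_lora)
```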

Code Reference

Source Location

  • Repository: InternVL
  • File: internvl_chat/internvl/model/internvl_chat/modeling_internvl_chat.py
  • Lines: L111-119

Signature

def wrap_backbone_lora(self, r=128, lora_alpha=256, lora_dropout=0.05):
    """
    Wrap the vision encoder with LoRA adapters.

    Args:
        r: int - LoRA rank (default 128)
        lora_alpha: int - Scaling factor (default 256; convention: 2 * r)
        lora_dropout: float - Dropout on LoRA path (default 0.05)

    Target modules: ['attn.qkv', 'attn.proj', 'mlp.fc1', 'mlp.fc2']
    """

Import

from internvl.model.internvl_chat import InternVLChatModel
# Called as: model.wrap_backbone_lora(r=16, lora_alpha=32)

I/O Contract

Inputs

Name | Type | Required | Description
r | int | No | LoRA rank (default 128)
lora_alpha | int | No | Scaling factor (default 256)
lora_dropout | float | No | Dropout probability (default 0.05)
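The lora_alpha = 2 * r convention fixes the effective adapter scaling at 2.0, since PEFT multiplies the low-rank update by lora_alpha / r. A quick check (pure-Python sketch; the function name is illustrative):

```python
def lora_scaling(r: int, lora_alpha: int) -> float:
    # PEFT applies the LoRA update as (lora_alpha / r) * B @ A @ x,
    # so the ratio is the only thing that matters for output magnitude.
    return lora_alpha / r

print(lora_scaling(128, 256))  # defaults -> 2.0
print(lora_scaling(16, 32))    # the r=16 example call -> same 2.0
```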

Outputs

Name | Type | Description
self.vision_model | PeftModel | Vision encoder with LoRA adapters; only adapter parameters trainable

Usage Examples

Inject Vision Encoder LoRA

model = InternVLChatModel.from_pretrained('OpenGVLab/InternVL2_5-8B')

# Freeze base vision encoder, then inject LoRA
for param in model.vision_model.parameters():
    param.requires_grad = False

model.wrap_backbone_lora(r=16, lora_alpha=32)
model.config.use_backbone_lora = 16
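The size of the injected adapters can be estimated without instantiating the model: LoRA adds a pair of low-rank matrices per wrapped linear layer, so a (d_out x d_in) projection gains r * (d_in + d_out) parameters. A sketch, using a hypothetical 1024 -> 1024 projection for illustration (the actual InternViT layer dimensions depend on the model variant):

```python
def lora_param_count(d_in: int, d_out: int, r: int) -> int:
    # LoRA adds A with shape (r, d_in) and B with shape (d_out, r).
    return r * (d_in + d_out)

# Hypothetical 1024 -> 1024 projection at the example rank r=16:
print(lora_param_count(1024, 1024, 16))  # 32768 extra trainable parameters
```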

