
Implementation:Axolotl ai cloud Axolotl Do Merge Lora

From Leeroopedia


Knowledge Sources
Domains Model_Export, Parameter_Efficient_Finetuning
Last Updated 2026-02-06 23:00 GMT

Overview

A concrete tool, provided by the Axolotl framework, for merging LoRA adapter weights into the base model and saving the result.

Description

The do_merge_lora function loads a base model, loads the trained LoRA adapter on top of it, calls merge_and_unload() to fold the adapter weights into the base model weights, and saves the resulting standalone model. The merged model is saved to a merged/ subdirectory within the output directory.
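The merge step itself is simple arithmetic: merge_and_unload() folds each adapter's low-rank update into the corresponding base weight, W_merged = W + (alpha/r) * B @ A, so the extra adapter matmul disappears at inference time. A minimal NumPy sketch of that arithmetic (the shapes, scaling, and variable names here are illustrative, not Axolotl internals):

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4    # illustrative LoRA dimensions
W = rng.normal(size=(d_out, d_in))    # frozen base weight
A = rng.normal(size=(r, d_in))        # LoRA down-projection
B = rng.normal(size=(d_out, r))       # LoRA up-projection
x = rng.normal(size=d_in)

# Forward pass with a live adapter: base path plus scaled low-rank path.
y_adapter = W @ x + (alpha / r) * (B @ (A @ x))

# Merging folds the low-rank update into the weight matrix itself.
W_merged = W + (alpha / r) * (B @ A)
y_merged = W_merged @ x

print(np.allclose(y_adapter, y_merged))  # True: both paths are equivalent
```

Because the merged weight reproduces the adapter's outputs exactly (up to floating-point rounding), the saved model needs no PEFT dependency at serving time.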

Usage

Invoke via the Axolotl CLI (axolotl merge-lora config.yml), or call it programmatically after training completes.

Code Reference

Source Location

  • Repository: axolotl
  • File: src/axolotl/cli/merge_lora.py
  • Lines: L18-52

Signature

def do_merge_lora(
    *,
    cfg: DictDefault,
) -> None:
    """Merge LoRA adapter weights into base model and save.

    Loads the base model and LoRA adapter, calls merge_and_unload(),
    and saves the merged model to {cfg.output_dir}/merged/.

    Args:
        cfg: Configuration with output_dir (containing adapter), base_model,
             torch_dtype, and other model loading settings.
    """

Import

from axolotl.cli.merge_lora import do_merge_lora

I/O Contract

Inputs

Name | Type        | Required | Description
cfg  | DictDefault | Yes      | Config with output_dir (pointing to the trained LoRA adapter), base_model, and torch_dtype

Outputs

Name         | Type      | Description
merged model | Directory | Standalone merged model saved to {cfg.output_dir}/merged/ in SafeTensors format

Usage Examples

CLI Usage

# Merge LoRA adapter after training
axolotl merge-lora examples/llama-3/qlora-1b.yml --output_dir ./my_trained_model

Programmatic Usage

from axolotl.cli.config import load_cfg
from axolotl.cli.merge_lora import do_merge_lora

cfg = load_cfg("examples/llama-3/qlora-1b.yml")
cfg.output_dir = "./my_trained_model"
do_merge_lora(cfg=cfg)
# Merged model now at ./my_trained_model/merged/
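Before pointing an inference stack at the merged directory, it is worth a quick sanity check that it contains a config and at least one SafeTensors weight shard. A small stdlib-only sketch (the file layout mirrors what Hugging Face-style save_pretrained typically emits; exact shard names vary, and the demo directory below is a stand-in for {cfg.output_dir}/merged/):

```python
import tempfile
from pathlib import Path


def looks_like_merged_model(merged_dir: Path) -> bool:
    """Heuristic check: a merged HF-style model directory has a
    config.json and at least one .safetensors weight shard."""
    return (merged_dir / "config.json").is_file() and any(
        merged_dir.glob("*.safetensors")
    )


# Demo against a throwaway directory standing in for the merge output.
demo = Path(tempfile.mkdtemp()) / "merged"
demo.mkdir()
(demo / "config.json").write_text("{}")
(demo / "model.safetensors").write_bytes(b"")

print(looks_like_merged_model(demo))  # True
```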

Related Pages

Implements Principle

Requires Environment
