Principle: Predibase LoRAX Adapter Merge Strategies
| Knowledge Sources | |
|---|---|
| Domains | Model_Merging, Parameter_Efficient_Finetuning |
| Last Updated | 2026-02-08 02:00 GMT |
Overview
A family of algorithms for combining multiple LoRA adapter weight tensors into a single merged adapter, including linear interpolation, TIES (trim-elect-sign), and DARE (drop-and-rescale) methods.
Description
Adapter Merge Strategies address the challenge of combining knowledge from multiple fine-tuned adapters. Four strategies are supported:
- Linear: Weighted sum of adapter tensors. Simple and fast, but can lead to interference between conflicting parameters.
- TIES (Trim, Elect Sign, Disjoint Merge): Prunes low-magnitude parameters, resolves sign conflicts via majority voting, then merges with disjoint combination. Reduces interference.
- DARE Linear: Randomly drops parameters (with rescaling to preserve expected values), then performs linear merge. Based on the observation that most adapter parameters are redundant.
- DARE TIES: Combines DARE random pruning with TIES sign election and disjoint merge. Most sophisticated strategy.
Usage
Choose strategy based on adapter compatibility:
- Linear: When adapters are complementary (different domains, no conflicts)
- TIES: When adapters may have conflicting parameter signs
- DARE variants: When adapters are large-rank and parameter redundancy is expected
Theoretical Basis
Linear Merge
- Compute a weighted sum of the per-adapter weight deltas: merged = Σᵢ wᵢ · ΔWᵢ (a convex combination when the weights sum to 1)
- No conflict resolution: opposing updates to the same parameter can cancel or interfere
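A minimal sketch of linear merging with NumPy. The function name `linear_merge` and the representation of each adapter as a dense delta tensor are assumptions for illustration; LoRAX operates on LoRA A/B factors internally.

```python
import numpy as np

def linear_merge(deltas, weights):
    """Weighted sum of adapter weight deltas: merged = sum_i w_i * delta_W_i."""
    merged = np.zeros_like(deltas[0])
    for delta_w, w in zip(deltas, weights):
        merged += w * delta_w
    return merged

# Two toy adapters, equal weights: the merge is their elementwise average.
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
merged = linear_merge([a, b], [0.5, 0.5])  # -> [2.0, 3.0]
```

Note how a parameter where `a` and `b` carry opposite signs would shrink toward zero here, which is the interference TIES is designed to avoid.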
TIES Merge
- Trim: Prune smallest values by magnitude, keeping top density fraction
- Elect: Compute majority sign per parameter position across adapters
- Merge: Keep only values matching majority sign, average them
Pseudo-code:
# TIES algorithm (sketch)
for each adapter i:
    pruned[i] = prune_by_magnitude(delta_W[i], density)   # Trim: keep top-density fraction by magnitude
majority_sign = sign(sum_i(pruned[i] * weights[i]))       # Elect: sign with the larger total mass per parameter
merged = disjoint_merge(pruned, weights, majority_sign)   # Merge: average only values matching the elected sign
DARE
- Drop: Randomly zero out parameters with probability (1 - density)
- Rescale: Multiply remaining by 1/density to preserve expected value
- Merge: Apply linear or TIES merge on sparsified tensors
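A sketch of the drop-and-rescale step followed by a linear merge. The names `dare_sparsify` and `dare_linear_merge` are illustrative; the key invariant is that rescaling by `1/density` keeps the expected value of each surviving parameter unchanged.

```python
import numpy as np

def dare_sparsify(delta_w, density, rng):
    """Drop entries with probability (1 - density); rescale survivors
    by 1/density so each parameter's expected value is preserved."""
    mask = rng.random(delta_w.shape) < density
    return np.where(mask, delta_w / density, 0.0)

def dare_linear_merge(deltas, weights, density, seed=0):
    """DARE Linear: sparsify each adapter independently, then linear-merge.
    DARE TIES would instead pass the sparsified tensors to a TIES merge."""
    rng = np.random.default_rng(seed)
    sparsified = [dare_sparsify(d, density, rng) for d in deltas]
    return sum(w * s for w, s in zip(weights, sparsified))
```

With `density=1.0` nothing is dropped and this reduces exactly to a plain linear merge, which is a convenient sanity check.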
Related Pages
Implemented By