
Implementation:Predibase Lorax Strategy Registry

From Leeroopedia


Knowledge Sources
Domains Model_Merging, Parameter_Efficient_Finetuning
Last Updated 2026-02-08 02:00 GMT

Overview

Concrete tooling for executing adapter merges: the LoRAX strategy registry maps merge-strategy names to MergeStrategy implementations that combine multiple adapter weight tensors into one.

Description

The strategy_registry maps strategy name strings to MergeStrategy subclasses: LinearMerge, TiesMerge, DareLinearMerge, and DareTiesMerge. Each implements a merge(task_tensors, weights) method that combines multiple adapter weight tensors into one. Helper functions in merges/utils.py provide pruning, majority sign calculation, and disjoint merge operations.
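The helper operations named above (pruning, majority-sign election, disjoint merge) can be sketched end to end. The following is a simplified illustration of a TIES-style merge under the "total" sign method; the function names are illustrative and are not the actual code in merges/utils.py:

```python
import torch

def prune_by_magnitude(t: torch.Tensor, density: float) -> torch.Tensor:
    # Keep the top `density` fraction of entries by magnitude, zero the rest.
    k = max(1, int(density * t.numel()))
    mask = torch.zeros(t.numel(), dtype=torch.bool)
    mask[torch.topk(t.abs().flatten(), k).indices] = True
    return t * mask.reshape(t.shape)

def ties_merge_sketch(task_tensors, weights, density=0.5):
    # 1. Prune each adapter tensor by magnitude.
    pruned = [prune_by_magnitude(t, density) for t in task_tensors]
    # 2. Scale by per-adapter weights (view assumes 2-D task tensors).
    stacked = torch.stack(pruned) * weights.view(-1, 1, 1)
    # 3. Elect the majority sign per element by total signed mass.
    sign = torch.sign(stacked.sum(dim=0))
    # 4. Disjoint merge: average only entries agreeing with the elected sign.
    agree = torch.sign(stacked) == sign
    num = (stacked * agree).sum(dim=0)
    den = agree.sum(dim=0).clamp(min=1)
    return num / den
```

With density=1.0 no pruning occurs and the result reduces to a sign-filtered weighted average.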

Usage

Used internally during adapter merging when merged_adapters is specified in a request. The strategy is selected by the merge_strategy parameter string.
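A request that triggers this path might look like the sketch below. The field names inside merged_adapters are assumptions for illustration only, not taken from the LoRAX API reference; consult the LoRAX docs for the actual request schema:

```python
# Hypothetical request payload; merged_adapters field names are assumptions.
payload = {
    "inputs": "What is the capital of France?",
    "parameters": {
        "merged_adapters": {
            "ids": ["adapter-a", "adapter-b"],      # adapters to merge
            "weights": [0.6, 0.4],                  # per-adapter blend weights
            "merge_strategy": "ties",               # key into strategy_registry
            "density": 0.7,                         # pruning density for TIES
        }
    },
}
```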

Code Reference

Source Location

  • Repository: LoRAX
  • File: server/lorax_server/utils/merges/strategies.py
  • Lines: 38-105

Signature

class MergeStrategy(ABC):
    def merge(self, task_tensors: List[torch.Tensor], weights: torch.Tensor) -> torch.Tensor:
        raise NotImplementedError()

class LinearMerge(MergeStrategy):
    def merge(self, task_tensors, weights) -> torch.Tensor:
        """Weighted sum of task tensors."""

class TiesMerge(MergeStrategy):
    def __init__(self, density: float, majority_sign_method: str = "total"): ...
    def merge(self, task_tensors, weights) -> torch.Tensor:
        """Prune by magnitude, elect majority sign, disjoint merge."""

class DareLinearMerge(MergeStrategy):
    def __init__(self, density: float): ...
    def merge(self, task_tensors, weights) -> torch.Tensor:
        """Random prune with rescale, then linear merge."""

class DareTiesMerge(MergeStrategy):
    def __init__(self, density: float, majority_sign_method: str = "total"): ...
    def merge(self, task_tensors, weights) -> torch.Tensor:
        """Random prune with rescale, elect sign, disjoint merge."""

strategy_registry: Dict[str, Type[MergeStrategy]] = {
    "linear": LinearMerge,
    "ties": TiesMerge,
    "dare_linear": DareLinearMerge,
    "dare_ties": DareTiesMerge,
}
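For reference, the weighted sum computed by LinearMerge amounts to the following sketch (illustrative, not the registry's actual implementation):

```python
import torch

def linear_merge_sketch(task_tensors, weights):
    # Weighted sum over the adapter axis: sum_i w_i * T_i.
    stacked = torch.stack(task_tensors)                     # [n_adapters, *shape]
    w = weights.view(-1, *([1] * (stacked.dim() - 1)))      # broadcast weights
    return (stacked * w).sum(dim=0)
```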

Import

from lorax_server.utils.merges.strategies import strategy_registry, MergeStrategy

I/O Contract

Inputs

Name                  Type                Required  Description
task_tensors          List[torch.Tensor]  Yes       Per-adapter weight tensors to merge
weights               torch.Tensor        Yes       Per-adapter blending weights
density               float               No        Pruning density (for TIES/DARE strategies)
majority_sign_method  str                 No        "total" or "frequency" (for TIES/DARE-TIES)

Outputs

Name           Type          Description
merged_tensor  torch.Tensor  Single merged weight tensor

Usage Examples

Using Strategy Registry

import torch
from lorax_server.utils.merges.strategies import strategy_registry

# Create strategy instance
strategy = strategy_registry["ties"](density=0.7, majority_sign_method="total")

# Merge adapter tensors
tensor_a = torch.randn(768, 16)  # adapter A weights
tensor_b = torch.randn(768, 16)  # adapter B weights
weights = torch.tensor([0.6, 0.4])

merged = strategy.merge([tensor_a, tensor_b], weights)
# merged shape: [768, 16]
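The "random prune with rescale" step shared by both DARE strategies can be sketched as follows; the function name and generator handling are illustrative, not the code in merges/utils.py:

```python
import torch

def dare_random_prune_sketch(t: torch.Tensor, density: float,
                             generator=None) -> torch.Tensor:
    # Drop each entry independently with probability (1 - density),
    # then rescale survivors by 1/density to keep the expected value.
    mask = torch.rand(t.shape, generator=generator) < density
    return (t * mask) / density
```

At density=1.0 every entry survives and the tensor is returned unchanged in expectation and in value.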
