
Principle:Eric Mitchell Direct Preference Optimization Hydra Configuration

From Leeroopedia


Knowledge Sources
Domains: Configuration_Management, Experiment_Management, Software_Engineering
Last Updated: 2026-02-08 02:00 GMT

Overview

A hierarchical configuration management pattern that uses Hydra and OmegaConf to compose training hyperparameters from modular YAML files with CLI overrides.

Description

Hydra configuration provides a structured approach to managing the many hyperparameters involved in DPO and SFT training. The system uses:

  • Composable config groups: Separate YAML files for loss type (sft.yaml, dpo.yaml), model architecture (gpt2-large.yaml, pythia28.yaml, etc.), and shared training parameters (config.yaml)
  • CLI overrides: Any parameter can be overridden from the command line (e.g., loss.beta=0.1)
  • Mandatory parameters: Using OmegaConf's ??? marker to enforce required values (e.g., exp_name, loss.beta for DPO)
  • Custom resolvers: The get_local_run_dir resolver dynamically computes the output directory
  • Config serialization: The resolved config is saved to the run directory for reproducibility
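As an illustration, the config groups might contain files like the following (file names follow the convention described above; the contents are hypothetical, not the repository's actual values):

```yaml
# config/config.yaml -- shared training parameters
exp_name: ???        # mandatory: the run fails unless this is set on the CLI
lr: 5e-7
batch_size: 64

# config/loss/dpo.yaml -- composed when loss=dpo is selected
loss:
  name: dpo
  beta: ???          # mandatory for DPO; e.g. override with loss.beta=0.1

# config/loss/sft.yaml -- composed when loss=sft is selected
loss:
  name: sft
```

Selecting `loss=sft` or `loss=dpo` swaps the entire group, so loss-specific keys never leak between run types.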

Usage

Use this principle when configuring training runs. The Hydra system handles parameter composition, validation, and serialization, ensuring experiment reproducibility.
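A launch command under this scheme (the script name and values here are illustrative) selects config groups by name and overrides individual leaf values:

```shell
# choose the model and loss config groups, then override leaf values
python train.py model=pythia28 loss=dpo loss.beta=0.1 exp_name=dpo_pythia28
```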

Theoretical Basis

Hierarchical configuration management follows the principle of separation of concerns: model-specific settings, loss-specific settings, and general training parameters are maintained independently and composed at runtime. This enables:

  • Systematic hyperparameter sweeps (changing only the relevant config group)
  • Clear documentation of what parameters each component requires
  • Reproducibility through config serialization

Pseudo-code:

# Abstract configuration flow (NOT actual implementation)
base_config = load_yaml("config/config.yaml")
loss_config = load_yaml(f"config/loss/{loss_name}.yaml")
model_config = load_yaml(f"config/model/{model_name}.yaml")
config = merge(base_config, loss_config, model_config, cli_overrides)
validate_no_missing_keys(config)
save_config(config, run_dir)
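The flow above can be made concrete with a minimal stdlib-only sketch. Plain dicts stand in for Hydra/OmegaConf objects, and the `"???"` string mimics OmegaConf's mandatory-value marker; all names here are illustrative, not the actual implementation:

```python
MISSING = "???"  # stands in for OmegaConf's mandatory-value marker

def merge(*configs):
    """Later configs override earlier ones; nested dicts merge recursively."""
    out = {}
    for cfg in configs:
        for key, value in cfg.items():
            if isinstance(value, dict) and isinstance(out.get(key), dict):
                out[key] = merge(out[key], value)
            else:
                out[key] = value
    return out

def validate_no_missing_keys(cfg, path=""):
    """Raise if any value is still the mandatory '???' marker."""
    for key, value in cfg.items():
        full = f"{path}.{key}" if path else key
        if isinstance(value, dict):
            validate_no_missing_keys(value, full)
        elif value == MISSING:
            raise ValueError(f"Missing mandatory value: {full}")

# Hypothetical composed groups and CLI overrides
base = {"exp_name": MISSING, "lr": 5e-7, "loss": {}}
loss = {"loss": {"name": "dpo", "beta": MISSING}}
cli = {"exp_name": "dpo_pythia28", "loss": {"beta": 0.1}}

config = merge(base, loss, cli)
validate_no_missing_keys(config)  # passes: all '???' values were overridden
```

Without the CLI overrides, `validate_no_missing_keys` raises on `exp_name` and `loss.beta`, which is exactly the enforcement role the `???` marker plays in the real system.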

Related Pages

Implemented By
