Implementation:BerriAI Litellm Get Llm Provider

From Leeroopedia
Knowledge Sources: BerriAI/litellm - litellm/litellm_core_utils/get_llm_provider_logic.py
Domains: LLM Integration, Provider Resolution, Routing
Last Updated: 2026-02-15

Overview

A concrete tool for resolving a model name string to the correct LLM provider, exposed by the litellm Python package as the get_llm_provider() function in litellm/litellm_core_utils/get_llm_provider_logic.py.

Description

The get_llm_provider() function is the central model-to-provider resolution engine in LiteLLM. Given a model string (e.g., "azure/chatgpt-v-2", "anthropic/claude-3-opus-20240229", or "gpt-4"), it determines:

  • The provider-native model identifier (stripped of any provider prefix).
  • The custom LLM provider name (e.g., "azure", "anthropic", "openai").
  • An optional dynamic API key resolved from environment variables.
  • An optional API base URL for the target endpoint.

The function implements a multi-stage resolution pipeline: explicit provider override, slash-prefix parsing against the provider registry, known-endpoint matching, JSON-configured provider lookup, model-list scanning, and finally raising an error if the model cannot be mapped to any provider.
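
As an illustration, a minimal, self-contained sketch of the first two stages might look like the following. This is not LiteLLM's implementation; the registry here is a small stand-in for the real provider registry, and later stages are omitted.

from typing import Optional, Tuple

# Hypothetical sketch only -- not LiteLLM's code. KNOWN_PROVIDERS stands in
# for the real provider registry.
KNOWN_PROVIDERS = {"openai", "azure", "anthropic", "bedrock", "vertex_ai"}

def resolve_provider_sketch(
    model: str,
    custom_llm_provider: Optional[str] = None,
) -> Tuple[str, Optional[str]]:
    # Stage 1: an explicit override short-circuits all parsing.
    if custom_llm_provider:
        return model, custom_llm_provider
    # Stage 2: split on the first "/" and check the prefix against the registry.
    if "/" in model:
        prefix, _, remainder = model.partition("/")
        if prefix in KNOWN_PROVIDERS:
            return remainder, prefix
    # Later stages (endpoint matching, model-list scanning, ...) would follow here.
    return model, None

print(resolve_provider_sketch("azure/chatgpt-v-2"))  # ('chatgpt-v-2', 'azure')
print(resolve_provider_sketch("gpt-4"))              # ('gpt-4', None)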

Usage

This function is called internally by litellm.completion() and litellm.acompletion() to determine which provider handler to invoke. It can also be called directly to inspect provider resolution without making an API call.
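
For example, a caller can resolve the provider first and branch on the result before issuing any request (a minimal sketch; the branching logic is purely illustrative):

from litellm.litellm_core_utils.get_llm_provider_logic import get_llm_provider

model, provider, _, _ = get_llm_provider("anthropic/claude-3-opus-20240229")
if provider == "anthropic":
    # Apply Anthropic-specific settings here before calling litellm.completion().
    pass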

Code Reference

Source Location: litellm/litellm_core_utils/get_llm_provider_logic.py, lines 99-939

Signature:

def get_llm_provider(
    model: str,
    custom_llm_provider: Optional[str] = None,
    api_base: Optional[str] = None,
    api_key: Optional[str] = None,
    litellm_params: Optional[LiteLLM_Params] = None,
) -> Tuple[str, str, Optional[str], Optional[str]]:
    """
    Returns the provider for a given model name - e.g. 'azure/chatgpt-v-2' -> 'azure'

    For router -> Can also give the whole litellm param dict ->
    this function will extract the relevant details

    Raises Error - if unable to map model to a provider

    Return model, custom_llm_provider, dynamic_api_key, api_base
    """

Import:

from litellm.litellm_core_utils.get_llm_provider_logic import get_llm_provider
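
In recent LiteLLM versions the function is also re-exported at the package level, so the following typically works as well (hedged; verify against your installed version):

import litellm

litellm.get_llm_provider("azure/chatgpt-v-2")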

I/O Contract

Inputs

Parameter | Type | Description
model | str | Required. The model identifier, optionally prefixed with a provider name separated by / (e.g., "openai/gpt-4", "anthropic/claude-3-opus-20240229", "gpt-4").
custom_llm_provider | Optional[str] | Explicit provider override. If set, bypasses prefix-based parsing.
api_base | Optional[str] | A custom API base URL. Matched against known provider endpoints for resolution.
api_key | Optional[str] | An API key, which may be a literal key or an os.environ/-prefixed reference to an environment variable.
litellm_params | Optional[LiteLLM_Params] | A parameter object from the Router system that bundles custom_llm_provider, api_base, and api_key together. When provided, the individual parameters must be None.
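
A hedged sketch of router-style usage, assuming LiteLLM_Params (importable from litellm.types.router in current releases) accepts model and api_base keyword fields; the endpoint URL is purely illustrative:

from litellm.types.router import LiteLLM_Params
from litellm.litellm_core_utils.get_llm_provider_logic import get_llm_provider

params = LiteLLM_Params(
    model="azure/chatgpt-v-2",
    api_base="https://my-azure-endpoint.openai.azure.com",
)
model, provider, dynamic_api_key, api_base = get_llm_provider(
    model=params.model,
    litellm_params=params,
)
# Expected per the contract above: model = "chatgpt-v-2", provider = "azure",
# api_base taken from the bundled params.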

Outputs

Output | Type | Description
model | str | The provider-native model identifier, with the provider prefix stripped (e.g., "chatgpt-v-2" from "azure/chatgpt-v-2").
custom_llm_provider | str | The resolved provider name (e.g., "azure", "openai", "anthropic").
dynamic_api_key | Optional[str] | An API key resolved from environment variables (when the input key used os.environ/ notation), or None.
api_base | Optional[str] | The resolved API base URL, or None if not applicable.
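
If none of the resolution stages succeeds, the function raises rather than returning None values. A minimal sketch of defensive usage (LiteLLM raises its BadRequestError here in current releases; catching Exception keeps the sketch version-agnostic):

from litellm.litellm_core_utils.get_llm_provider_logic import get_llm_provider

try:
    get_llm_provider("some-unknown-model-name")
except Exception as err:  # typically litellm's BadRequestError
    print(f"Could not resolve provider: {err}")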

Usage Examples

Basic provider resolution:

from litellm.litellm_core_utils.get_llm_provider_logic import get_llm_provider

# Slash-prefixed model string
model, provider, api_key, api_base = get_llm_provider("azure/chatgpt-v-2")
# model = "chatgpt-v-2", provider = "azure", api_key = None, api_base = None

# Bare model string (resolved via model list lookup)
model, provider, api_key, api_base = get_llm_provider("gpt-4")
# model = "gpt-4", provider = "openai", api_key = None, api_base = None

With explicit provider override:

model, provider, api_key, api_base = get_llm_provider(
    model="my-custom-model",
    custom_llm_provider="openai",
    api_base="https://my-endpoint.example.com/v1"
)
# model = "openai/my-custom-model", provider = "openai"

With environment variable key resolution:

model, provider, api_key, api_base = get_llm_provider(
    model="anthropic/claude-3-opus-20240229",
    api_key="os.environ/ANTHROPIC_API_KEY"
)
# api_key is resolved to the value of os.environ["ANTHROPIC_API_KEY"]
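
For intuition, the os.environ/ convention simply maps a key reference to an environment lookup. The helper below is a hypothetical illustration of that convention, not LiteLLM's internal resolution code:

import os
from typing import Optional

def resolve_env_key_sketch(api_key: Optional[str]) -> Optional[str]:
    # Hypothetical helper: "os.environ/NAME" -> os.environ["NAME"]; anything else passes through.
    if api_key and api_key.startswith("os.environ/"):
        return os.environ.get(api_key.removeprefix("os.environ/"))
    return api_key

print(resolve_env_key_sketch("os.environ/ANTHROPIC_API_KEY"))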
