
Environment:Microsoft Autogen LLM Provider API Keys

From Leeroopedia
Knowledge Sources
Domains Infrastructure, Credentials
Last Updated 2026-02-11 18:00 GMT

Overview

This page lists the API key environment variables required by AutoGen model clients to connect to OpenAI, Azure OpenAI, Anthropic, Google Gemini, and other LLM providers.

Description

AutoGen model clients read API keys from environment variables at runtime. Each LLM provider has its own key variable. The autogen-ext package provides model client wrappers for OpenAI, Azure OpenAI, Anthropic, Google Gemini, and LLaMA Cloud. All clients follow the pattern of checking environment variables as a fallback when no explicit `api_key` is passed to the constructor.

Usage

Use this environment whenever creating a model client (e.g., `OpenAIChatCompletionClient`, `AzureOpenAIChatCompletionClient`). Without the appropriate API key, any agent that depends on an LLM will fail at inference time.
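Because a missing key only surfaces at inference time, it can help to fail fast at startup instead. A minimal sketch (the `require_env` helper is hypothetical, not part of autogen-ext):

```python
import os

def require_env(var: str) -> str:
    """Fail fast if a provider key is missing, rather than at inference time."""
    value = os.environ.get(var)
    if not value:
        raise RuntimeError(
            f"{var} is not set; agents using this provider will fail at inference time."
        )
    return value

# Example: check before constructing an OpenAIChatCompletionClient.
# require_env("OPENAI_API_KEY")
```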

System Requirements

  • Network: Internet access (required for API calls to LLM providers)
  • Python: `autogen-ext` with the provider extra (e.g., `pip install autogen-ext[openai]`)

Dependencies

Python Packages (by provider)

  • OpenAI: `openai` >= 1.93, `tiktoken` >= 0.8.0, `aiofiles`
  • Azure OpenAI: `azure-ai-inference` >= 1.0.0b9, `azure-identity`, `azure-core`
  • Anthropic: `anthropic` >= 0.48
  • Google Gemini: `google-genai` >= 1.0.0
  • Ollama (local): `ollama` >= 0.4.7, `tiktoken` >= 0.8.0
  • LLaMA CPP (local): `llama-cpp-python` >= 0.3.8

Credentials

The following environment variables must be set depending on which LLM provider you use:

OpenAI:

  • `OPENAI_API_KEY`: OpenAI API key for GPT models.

Azure OpenAI:

  • `AZURE_OPENAI_API_KEY`: Azure OpenAI service key.
  • `AZURE_PROJECT_ENDPOINT`: Azure AI project endpoint URL.

Anthropic:

  • `ANTHROPIC_API_KEY`: Anthropic API key for Claude models.

Google Gemini:

  • `GEMINI_API_KEY`: Google AI API key for Gemini models.

LLaMA Cloud:

  • `LLAMA_API_KEY`: LLaMA Cloud API key.

GitHub Models:

  • `GITHUB_TOKEN`: GitHub personal access token for GitHub-hosted models.
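These variables are typically exported in the shell before launching the application. All values below are placeholders; set only the keys for the providers you actually use:

```shell
# Placeholder values: replace with real credentials for the providers in use
export OPENAI_API_KEY="sk-your-openai-key"
export AZURE_OPENAI_API_KEY="your-azure-openai-key"
export AZURE_PROJECT_ENDPOINT="https://your-project.services.ai.azure.com/"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
export GEMINI_API_KEY="your-gemini-key"
export LLAMA_API_KEY="your-llama-cloud-key"
export GITHUB_TOKEN="ghp_your-github-token"
```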

Quick Install

# Install with OpenAI support
pip install "autogen-ext[openai]>=0.7.5"

# Install with Azure support
pip install "autogen-ext[azure]>=0.7.5"

# Install with Anthropic support
pip install "autogen-ext[anthropic]>=0.7.5"

# Install with Gemini support
pip install "autogen-ext[gemini]>=0.7.5"

# Install with multiple providers
pip install "autogen-ext[openai,azure,anthropic,gemini]>=0.7.5"

Code Evidence

API key resolution from `autogen_ext/models/openai/_openai_client.py:1196`:

# OpenAI client falls back to OPENAI_API_KEY from the environment
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="gpt-4o",
    # api_key="your_openai_api_key"  # or set the OPENAI_API_KEY env var
)

Azure OpenAI key from `autogen_ext/models/openai/_openai_client.py:1539`:

# Azure OpenAI client falls back to AZURE_OPENAI_API_KEY
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient

client = AzureOpenAIChatCompletionClient(
    azure_deployment="gpt-4o",
    model="gpt-4o",
    api_version="2024-06-01",
    azure_endpoint="https://your-resource.openai.azure.com/",
    # api_key is read from AZURE_OPENAI_API_KEY when omitted
)

Provider-specific key checks from `autogen_ext/models/openai/_openai_client.py:1471-1481`:

import os

# Each provider-specific key is read from the environment at runtime
# when no explicit api_key is supplied (sketch of the fallback checks):
gemini_key = os.getenv("GEMINI_API_KEY")        # Gemini
anthropic_key = os.getenv("ANTHROPIC_API_KEY")  # Anthropic
llama_key = os.getenv("LLAMA_API_KEY")          # LLaMA Cloud

Common Errors

  • `openai.AuthenticationError: Incorrect API key`: invalid or missing `OPENAI_API_KEY`; set a valid `OPENAI_API_KEY` environment variable.
  • `anthropic.AuthenticationError`: invalid or missing `ANTHROPIC_API_KEY`; set a valid `ANTHROPIC_API_KEY` environment variable.
  • `azure.core.exceptions.ClientAuthenticationError`: invalid Azure credentials; set `AZURE_OPENAI_API_KEY` or configure Azure Identity.
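To catch these errors before any agent runs, a startup audit can report every missing variable at once. This is a sketch; `missing_provider_keys` is a hypothetical helper, and the list should be adjusted to the providers your deployment uses:

```python
import os

def missing_provider_keys(required: list[str]) -> list[str]:
    """Return the required provider key variables that are not set."""
    return [var for var in required if not os.environ.get(var)]

# Example startup audit (adjust the list to the providers you use):
missing = missing_provider_keys(["OPENAI_API_KEY", "ANTHROPIC_API_KEY"])
if missing:
    print(f"Missing provider API keys: {', '.join(missing)}")
```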

Compatibility Notes

  • Azure OpenAI: Supports both API key and Azure Identity (managed identity, service principal). The client automatically sets `User-Agent` to `autogen-python/{version}`.
  • Ollama: Runs locally, no API key required. Requires Ollama server running on the host.
  • LLaMA CPP: Runs locally, no API key required. Requires model file downloaded.
  • GitHub Models: Uses `GITHUB_TOKEN` for authentication with GitHub-hosted model endpoints.
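The key-or-identity choice for Azure can be expressed as a small selection step. This is a sketch only; `azure_auth_mode` is a hypothetical helper, and the real client accepts either an explicit `api_key` or an Azure Identity token provider:

```python
import os

def azure_auth_mode() -> str:
    """Pick the Azure auth path: explicit key if set, else Azure Identity.

    Hypothetical helper illustrating the choice; the real client takes
    either an api_key or a token provider built from Azure Identity.
    """
    if os.environ.get("AZURE_OPENAI_API_KEY"):
        return "api-key"
    return "azure-identity"
```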
