
Environment: OpenAI API Credentials (openai/openai-agents-python)

From Leeroopedia
Knowledge Sources
Domains: Infrastructure, Credentials
Last Updated: 2026-02-11 14:00 GMT

Overview

Environment variables required for OpenAI API access, tracing, model selection, and SDK behavior configuration.

Description

The OpenAI Agents SDK reads several environment variables at runtime to configure API access, tracing behavior, data logging, and default model selection. The `OPENAI_API_KEY` is the primary credential used for both model invocation and trace export. Additional variables control tracing, sensitive data logging, and debug behavior.

Usage

Use this environment configuration whenever running any agent that calls OpenAI models or exports traces. The `OPENAI_API_KEY` is mandatory for any real model invocation. The remaining variables are optional and control SDK behavior.
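Since `OPENAI_API_KEY` is the only mandatory variable, a launcher can fail fast before any agent runs. A minimal sketch (the `check_openai_env` helper is illustrative, not part of the SDK):

```python
import os

def check_openai_env() -> None:
    """Fail fast if the required OpenAI credential is missing."""
    if not os.environ.get("OPENAI_API_KEY"):
        raise RuntimeError(
            "OPENAI_API_KEY is not set; model calls and trace export will fail."
        )
```

Call this once at startup, before constructing any agent, so a missing credential surfaces as a clear error rather than a failed model call.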

System Requirements

  • OS: Any (environment variables are OS-independent)
  • Network: Internet access (required for OpenAI API calls and trace export)

Dependencies

Python Packages

  • `openai` >= 2.9.0 (uses the API key for model calls)

Credentials

The following environment variables are recognized by the SDK:

Required:

  • `OPENAI_API_KEY`: OpenAI API key for model invocation and trace export

Optional (Tracing):

  • `OPENAI_ORG_ID`: OpenAI organization ID for trace export
  • `OPENAI_PROJECT_ID`: OpenAI project ID for trace export
  • `OPENAI_AGENTS_DISABLE_TRACING`: Set to `"true"` or `"1"` to disable tracing entirely
  • `OPENAI_AGENTS_TRACE_INCLUDE_SENSITIVE_DATA`: Set to `"false"` to exclude sensitive data from traces (default: `"true"`)
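The enable-style flags above accept the strings `"true"` or `"1"`, case-insensitive, as shown in the tracing code evidence below. A minimal sketch of that parsing convention (the `env_flag` helper is illustrative, not an SDK function):

```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret an environment variable as a boolean flag.

    Mirrors the SDK convention: "true" or "1" (any case) means enabled;
    anything else, or an unset variable, falls back to the default.
    """
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.lower() in ("true", "1")
```

For example, `env_flag("OPENAI_AGENTS_DISABLE_TRACING")` returns `True` for `"TRUE"`, `"true"`, or `"1"`. Note that `OPENAI_AGENTS_TRACE_INCLUDE_SENSITIVE_DATA` works in the opposite direction (set to `"false"` to opt out), so this sketch covers only the enable-style flags.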

Optional (Logging):

  • `OPENAI_AGENTS_DONT_LOG_MODEL_DATA`: Controls whether LLM input/output is suppressed in debug logs (default: `True`, i.e. model data is not logged)
  • `OPENAI_AGENTS_DONT_LOG_TOOL_DATA`: Controls whether tool input/output is suppressed in debug logs (default: `True`, i.e. tool data is not logged)

Optional (Model Selection):

  • `OPENAI_DEFAULT_MODEL`: Override the default model name (default: `"gpt-4.1"`)

Optional (Codex Extension):

  • `CODEX_API_KEY`: API key for the experimental Codex tool (falls back to `OPENAI_API_KEY`)
  • `CODEX_PATH`: Override path to the Codex CLI binary
  • `OPENAI_AGENTS_CODEX_SUBPROCESS_STREAM_LIMIT_BYTES`: Stream buffer size limit for Codex subprocess

Optional (LiteLLM Extension):

  • `OPENAI_AGENTS_ENABLE_LITELLM_SERIALIZER_PATCH`: Set to `"true"` to enable LiteLLM serializer warning suppression
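When debugging a misconfigured environment, it can help to dump the current values of the variables above with the API key redacted. A sketch, assuming the variable names listed in this section (`snapshot_env` is a hypothetical helper, not an SDK function):

```python
import os

# Core SDK variables documented above (Codex/LiteLLM extras omitted).
AGENT_ENV_VARS = [
    "OPENAI_API_KEY",
    "OPENAI_ORG_ID",
    "OPENAI_PROJECT_ID",
    "OPENAI_AGENTS_DISABLE_TRACING",
    "OPENAI_AGENTS_TRACE_INCLUDE_SENSITIVE_DATA",
    "OPENAI_AGENTS_DONT_LOG_MODEL_DATA",
    "OPENAI_AGENTS_DONT_LOG_TOOL_DATA",
    "OPENAI_DEFAULT_MODEL",
]

def snapshot_env(redact: bool = True) -> dict:
    """Return the currently set SDK env vars, redacting API keys."""
    out = {}
    for name in AGENT_ENV_VARS:
        value = os.environ.get(name)
        if value is None:
            continue  # unset variables are omitted from the snapshot
        if redact and name.endswith("API_KEY"):
            value = (value[:5] + "...") if len(value) > 5 else "***"
        out[name] = value
    return out
```

Printing `snapshot_env()` at startup makes it easy to confirm which optional flags are actually in effect without leaking the credential into logs.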

Quick Install

# Set the required API key
export OPENAI_API_KEY="sk-your-key-here"

# Optional: disable tracing
export OPENAI_AGENTS_DISABLE_TRACING="true"

# Optional: override default model
export OPENAI_DEFAULT_MODEL="gpt-4.1"
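The same configuration can be applied from Python instead of the shell, as long as it happens before the SDK reads the environment (the key value here is a placeholder):

```python
import os

# Set variables before importing/initializing the SDK so they are
# visible when it reads the environment at startup.
os.environ["OPENAI_API_KEY"] = "sk-your-key-here"     # required
os.environ["OPENAI_AGENTS_DISABLE_TRACING"] = "true"  # optional
os.environ["OPENAI_DEFAULT_MODEL"] = "gpt-4.1"        # optional
```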

Code Evidence

API key resolution for tracing from `tracing/processors.py:91-92`:

@cached_property
def api_key(self):
    return self._api_key or os.environ.get("OPENAI_API_KEY")

Tracing disable flag from `tracing/provider.py:243-248`:

self._env_disabled = os.environ.get(
    "OPENAI_AGENTS_DISABLE_TRACING", "false"
).lower() in (
    "true",
    "1",
)

Debug data logging flags from `_debug.py:12-17`:

def _load_dont_log_model_data() -> bool:
    return _debug_flag_enabled("OPENAI_AGENTS_DONT_LOG_MODEL_DATA", default=True)

def _load_dont_log_tool_data() -> bool:
    return _debug_flag_enabled("OPENAI_AGENTS_DONT_LOG_TOOL_DATA", default=True)

Default model selection from `models/default_models.py:52-56`:

def get_default_model() -> str:
    """Returns the default model name."""
    return os.getenv(OPENAI_DEFAULT_MODEL_ENV_VARIABLE_NAME, "gpt-4.1").lower()

Common Errors

  • `OPENAI_API_KEY is not set, skipping trace export`
    Cause: API key not configured. Solution: set the `OPENAI_API_KEY` env var or pass a key via `set_api_key()`.
  • `[non-fatal] Tracing: server error {status_code}, retrying.`
    Cause: transient OpenAI API error during trace export. Solution: the exporter retries automatically with backoff; no action needed.
  • `Queue is full, dropping trace.`
    Cause: trace export queue overflow under heavy load. Solution: increase export throughput or reduce concurrent agent invocations.

Compatibility Notes

  • API key priority: Programmatic `set_api_key()` overrides `OPENAI_API_KEY` env var. Per-span `tracing_api_key` overrides both.
  • Codex tool: Resolves API key as `CODEX_API_KEY` first, then falls back to `OPENAI_API_KEY`.
  • Sensitive data defaults: By default, trace data includes sensitive data (`OPENAI_AGENTS_TRACE_INCLUDE_SENSITIVE_DATA` defaults to `"true"`), but debug logging suppresses it (both `DONT_LOG` flags default to `True`).
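The API key priority in the first bullet can be sketched as a simple resolution chain (the parameter names `span_key` and `programmatic_key` are illustrative, not SDK identifiers):

```python
import os
from typing import Optional

def resolve_tracing_api_key(
    span_key: Optional[str] = None,
    programmatic_key: Optional[str] = None,
) -> Optional[str]:
    """Illustrative resolution order: per-span tracing_api_key first,
    then a key passed via set_api_key(), then OPENAI_API_KEY."""
    if span_key is not None:
        return span_key
    if programmatic_key is not None:
        return programmatic_key
    return os.environ.get("OPENAI_API_KEY")
```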
