# Environment:BerriAI Litellm Python Runtime
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, LLM_Gateway |
| Last Updated | 2026-02-15 16:00 GMT |
## Overview
Python 3.9+ runtime environment with the core dependencies for the LiteLLM SDK (httpx, openai, tiktoken, pydantic).
## Description
This environment defines the minimum Python runtime and core package dependencies required to use the LiteLLM SDK for making LLM API calls. It covers the base `litellm` package without the optional proxy server or caching extras. The SDK supports Python 3.9 through 3.13 (with experimental 3.14 support, gated on grpcio wheel availability) and runs on Linux, macOS, and Windows.
## Usage
Use this environment for any project that calls `litellm.completion()`, `litellm.embedding()`, or other SDK-level functions. It is the minimum prerequisite for all LiteLLM functionality; the proxy server, caching backends, and observability integrations require additional optional dependencies.
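A minimal sketch of an SDK-level call under this environment. The model name is illustrative, and the network call is guarded so it only runs when a provider key is present:

```python
import os

# OpenAI-style message payload expected by the SDK
messages = [{"role": "user", "content": "Say hello in one word."}]

# Only attempt the request when a provider credential is configured
if os.getenv("OPENAI_API_KEY"):
    import litellm

    # litellm.completion() routes the request to the named provider/model
    response = litellm.completion(model="gpt-4o-mini", messages=messages)
    print(response.choices[0].message.content)
```

The response object mirrors the OpenAI response shape, so `response.choices[0].message.content` holds the generated text regardless of the underlying provider.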
## System Requirements
| Category | Requirement | Notes |
|---|---|---|
| OS | Linux, macOS, Windows | All platforms supported; some optional deps (uvloop, pyroscope-io) are Linux/macOS only |
| Python | >= 3.9, < 4.0 | 3.11 recommended for Docker deployments |
| Disk | 200 MB | For package installation and the tiktoken cache |
## Dependencies
### System Packages
- `python3` >= 3.9
- `pip` (package installer)
### Python Packages (Core)
- `openai` >= 2.8.0
- `httpx` >= 0.23.0
- `tiktoken` >= 0.7.0
- `pydantic` >= 2.5.0, < 3.0.0
- `aiohttp` >= 3.10
- `jinja2` >= 3.1.2, < 4.0.0
- `jsonschema` >= 4.23.0, < 5.0.0
- `importlib-metadata` >= 6.8.0
- `python-dotenv` >= 0.2.0
- `fastuuid` >= 0.13.0
- `tokenizers` (any version)
- `click` (any version)
## Credentials
The following environment variables configure provider API access and runtime behavior:
- `OPENAI_API_KEY`: OpenAI API key for GPT models.
- `ANTHROPIC_API_KEY`: Anthropic API key for Claude models.
- `LITELLM_MODE`: Set to "DEV" or "PRODUCTION" to control runtime behavior.
- `LITELLM_DROP_PARAMS`: Boolean flag to silently drop unsupported parameters.
- `LITELLM_MODEL_COST_MAP_URL`: Custom URL for model cost map (optional override).
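These variables can be exported in the shell, loaded from a `.env` file via `python-dotenv`, or set programmatically before the SDK is imported. A sketch of the programmatic form (the key is a placeholder; flag values are strings, as environment variables always are):

```python
import os

# Placeholder credential; replace with a real key or load it via python-dotenv
os.environ["OPENAI_API_KEY"] = "sk-placeholder"

# Runtime flags read by LiteLLM
os.environ["LITELLM_MODE"] = "PRODUCTION"
os.environ["LITELLM_DROP_PARAMS"] = "True"
```

Setting these before `import litellm` ensures they are visible to the SDK's initialization code.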
## Quick Install

```shell
# Install the core LiteLLM SDK
pip install litellm

# Or with proxy server extras
pip install 'litellm[proxy]'

# Or with all extras for a full deployment
pip install 'litellm[proxy,extra_proxy,caching]'
```
## Code Evidence
Python version constraint from `pyproject.toml:22`:

```toml
python = ">=3.9,<4.0"
```
Core dependency declarations from `pyproject.toml:23-34`:

```toml
fastuuid = ">=0.13.0"
httpx = ">=0.23.0"
openai = ">=2.8.0"
python-dotenv = ">=0.2.0"
tiktoken = ">=0.7.0"
importlib-metadata = ">=6.8.0"
tokenizers = "*"
click = "*"
jinja2 = "^3.1.2"
aiohttp = ">=3.10"
pydantic = "^2.5.0"
jsonschema = ">=4.23.0,<5.0.0"
```
Lazy loading system to avoid heavy imports at startup, from `litellm/__init__.py:1605`:

```python
if os.getenv("LITELLM_DISABLE_LAZY_LOADING"):
    # Load all modules eagerly
    ...
```
## Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `ImportError: No module named 'openai'` | Core dependency not installed | `pip install litellm` |
| `pydantic.errors.PydanticImportError` | Pydantic v1 installed instead of v2 | `pip install 'pydantic>=2.5.0'` |
| `ImportError: cannot import name 'Version'` | Old importlib-metadata | `pip install 'importlib-metadata>=6.8.0'` |
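A quick diagnostic for the pydantic v1/v2 failure mode above, using only the standard library. It reports the installed major version, or `None` when pydantic is absent:

```python
import importlib.metadata as md

try:
    # pydantic v2 is required; v1 triggers PydanticImportError inside LiteLLM
    major = int(md.version("pydantic").split(".")[0])
except md.PackageNotFoundError:
    major = None  # pydantic is not installed at all

print("pydantic major version:", major)
```

If this prints `1`, upgrading with `pip install 'pydantic>=2.5.0'` resolves the error per the table above.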
## Compatibility Notes
- Windows: `uvloop` and `pyroscope-io` are not available. The proxy server falls back to the default asyncio event loop.
- Python 3.14: requires `grpcio >= 1.75.0` due to wheel availability; grpcio versions 1.68.x through 1.73.0 have known bugs.
- Python 3.9: some optional packages, such as `mcp`, `a2a-sdk`, `polars`, and `mlflow`, require Python >= 3.10.
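The Python 3.9 caveat above can be handled with an explicit version gate before importing any of the 3.10-only optional packages (the actual import is left commented out, since these packages are optional):

```python
import sys

# mcp, a2a-sdk, polars, and mlflow all require Python >= 3.10
OPTIONAL_DEPS_SUPPORTED = sys.version_info >= (3, 10)

if OPTIONAL_DEPS_SUPPORTED:
    # import polars  # safe to import only on 3.10+
    pass
else:
    print("Python 3.9 detected: skipping 3.10-only optional packages")
```

This keeps the core SDK usable on 3.9 while degrading gracefully when the optional integrations cannot be loaded.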
## Related Pages
- Implementation:BerriAI_Litellm_Completion
- Implementation:BerriAI_Litellm_Litellm_Global_Configuration
- Implementation:BerriAI_Litellm_Run_Server
- Implementation:BerriAI_Litellm_Proxy_Request_Processing
- Implementation:BerriAI_Litellm_Get_Llm_Provider
- Implementation:BerriAI_Litellm_Model_Response
- Implementation:BerriAI_Litellm_Exception_Mapping