Environment: CrewAIInc CrewAI LLM Provider Credentials
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, LLM_Integration |
| Last Updated | 2026-02-11 17:00 GMT |
Overview
API credentials environment for 12+ LLM providers supported by CrewAI, including OpenAI, Anthropic, Google Gemini, AWS Bedrock, and Azure.
Description
CrewAI supports multiple LLM providers through environment variables. At minimum, one provider's API key must be configured. The default provider is OpenAI, but the framework also has native providers for Anthropic, Google Gemini, AWS Bedrock, and Azure AI Inference. Additional providers are available through the LiteLLM integration layer. Each provider requires specific environment variables for authentication.
Usage
Use this environment for any CrewAI workflow that involves LLM calls. This covers virtually all Agent, Crew, and Flow operations. The specific credentials needed depend on which `model` string is configured on your Agents or Crew.
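Because the required credential follows from the `model` string, a small preflight helper can tell you which variable to set before a run. This is a hypothetical sketch (not part of CrewAI's API); the provider-to-variable mapping mirrors the Credentials section and is intentionally partial.

```python
import os

# Hypothetical mapping from "provider/model" prefix to the credential
# variable that provider reads; extend as needed for other providers.
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "bedrock": "AWS_ACCESS_KEY_ID",
    "azure": "AZURE_API_KEY",
}

def required_key_var(model):
    # Bare model names (no "provider/" prefix) fall back to OpenAI,
    # the default provider.
    provider = model.split("/", 1)[0] if "/" in model else "openai"
    return PROVIDER_KEY_VARS.get(provider, "OPENAI_API_KEY")

def credential_present(model):
    return os.getenv(required_key_var(model)) is not None
```

For example, `required_key_var("anthropic/claude-sonnet-4")` points at `ANTHROPIC_API_KEY`, while a bare `"gpt-4o"` points at `OPENAI_API_KEY`.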
System Requirements
| Category | Requirement | Notes |
|---|---|---|
| Network | Internet access | Required for all cloud LLM API calls |
| Provider Account | At least one LLM provider account | OpenAI is the default provider |
Dependencies
Core (included in base install)
- `openai` >=1.83.0,<3 (default provider)
Optional Provider Packages
- `anthropic` ~=0.73.0 (install via `crewai[anthropic]`)
- `google-genai` ~=1.49.0 (install via `crewai[google-genai]`)
- `boto3` ~=1.40.45 (install via `crewai[bedrock]`)
- `azure-ai-inference` ~=1.0.0b9 (install via `crewai[azure-ai-inference]`)
- `litellm` >=1.74.9,<3 (install via `crewai[litellm]`)
- `ibm-watsonx-ai` ~=1.3.39 (install via `crewai[watson]`)
Credentials
OpenAI (Default Provider)
- `OPENAI_API_KEY`: OpenAI API key (required if using OpenAI models)
- `OPENAI_BASE_URL`: Custom base URL for OpenAI-compatible APIs
- `OPENAI_MODEL_NAME`: Override default model name
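A minimal shell setup for the default provider might look like the following; the key value is a placeholder, and `OPENAI_BASE_URL` is only needed when targeting an OpenAI-compatible gateway.

```shell
# Placeholder key value; replace with your real key.
export OPENAI_API_KEY="sk-..."
# Optional: only set this when routing through an OpenAI-compatible endpoint.
export OPENAI_BASE_URL="https://api.openai.com/v1"
```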
Anthropic
- `ANTHROPIC_API_KEY`: Anthropic API key (required for Claude models)
Google Gemini
- `GOOGLE_API_KEY` or `GEMINI_API_KEY`: Google API key (`GOOGLE_API_KEY` is checked first)
- `GOOGLE_CLOUD_PROJECT`: GCP project ID (for Vertex AI)
- `GOOGLE_CLOUD_LOCATION`: GCP region (defaults to `"us-central1"`)
- `GOOGLE_GENAI_USE_VERTEXAI`: Set to `"true"` to use Vertex AI instead of AI Studio
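The Gemini fallback order can be reproduced in a standalone sketch (the function name is illustrative, not CrewAI's API); it follows the precedence shown in the Code Evidence section: `GOOGLE_API_KEY` before `GEMINI_API_KEY`, location defaulting to `"us-central1"`, and Vertex AI enabled only when the flag is the literal string `"true"`.

```python
import os

def resolve_gemini_config(api_key=None, project=None, location=None):
    # Explicit arguments win; GOOGLE_API_KEY is checked before GEMINI_API_KEY.
    key = api_key or os.getenv("GOOGLE_API_KEY") or os.getenv("GEMINI_API_KEY")
    project = project or os.getenv("GOOGLE_CLOUD_PROJECT")
    # Location falls back to the documented default region.
    location = location or os.getenv("GOOGLE_CLOUD_LOCATION") or "us-central1"
    # Vertex AI is opt-in via a case-insensitive "true".
    use_vertexai = os.getenv("GOOGLE_GENAI_USE_VERTEXAI", "").lower() == "true"
    return key, project, location, use_vertexai
```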
AWS Bedrock
- `AWS_ACCESS_KEY_ID`: AWS access key
- `AWS_SECRET_ACCESS_KEY`: AWS secret key
- `AWS_SESSION_TOKEN`: AWS session token (optional, for temporary credentials)
- `AWS_REGION_NAME`: AWS region
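A static-credential Bedrock setup in the shell; all values below are placeholders, and the session token line is only uncommented when using temporary (STS) credentials.

```shell
# Placeholder values; replace with real IAM credentials.
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION_NAME="us-east-1"
# Only for temporary (STS) credentials:
# export AWS_SESSION_TOKEN="..."
```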
Azure
- `AZURE_API_KEY`: Azure API key
- `AZURE_ENDPOINT` or `AZURE_OPENAI_ENDPOINT` or `AZURE_API_BASE`: Azure endpoint URL
- `AZURE_API_VERSION`: API version (defaults to `"2024-06-01"`)
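The three accepted endpoint variables can be resolved with a simple fallback chain; this sketch assumes they are checked in the order listed above (an assumption, since the source only lists the alternatives), with the API version defaulting to `"2024-06-01"`.

```python
import os

def resolve_azure_config():
    # Assumed check order: AZURE_ENDPOINT, then AZURE_OPENAI_ENDPOINT,
    # then AZURE_API_BASE.
    endpoint = (
        os.getenv("AZURE_ENDPOINT")
        or os.getenv("AZURE_OPENAI_ENDPOINT")
        or os.getenv("AZURE_API_BASE")
    )
    api_version = os.getenv("AZURE_API_VERSION", "2024-06-01")
    return endpoint, api_version
```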
IBM Watson
- `WATSONX_URL`: WatsonX service URL
- `WATSONX_APIKEY`: WatsonX API key
- `WATSONX_PROJECT_ID`: WatsonX project ID
Other Providers
- `NVIDIA_NIM_API_KEY`: NVIDIA NIM API key
- `GROQ_API_KEY`: Groq API key
- `HF_TOKEN`: Hugging Face token
- `SAMBANOVA_API_KEY`: SambaNova API key
- `CEREBRAS_API_KEY`: Cerebras API key
Model Configuration
- `MODEL` or `MODEL_NAME`: Override default model name
- `BASE_URL` or `API_BASE`: Override default API base URL
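The generic overrides follow the same pattern as the provider-specific ones. This is a hypothetical resolution sketch: the precedence between `MODEL` and `MODEL_NAME` (and between `BASE_URL` and `API_BASE`) is an assumption, as is the `"gpt-4o"` default used here for illustration.

```python
import os

def resolve_model(explicit=None, default="gpt-4o"):
    # Assumed order: explicit argument, then MODEL, then MODEL_NAME,
    # then the illustrative default.
    return explicit or os.getenv("MODEL") or os.getenv("MODEL_NAME") or default

def resolve_base_url(explicit=None):
    # Same assumed pattern for the API base URL override.
    return explicit or os.getenv("BASE_URL") or os.getenv("API_BASE")
```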
Quick Install
```shell
# Default (OpenAI only)
uv add crewai

# With Anthropic native provider
uv add "crewai[anthropic]"

# With Google Gemini native provider
uv add "crewai[google-genai]"

# With AWS Bedrock native provider
uv add "crewai[bedrock]"

# With Azure AI Inference native provider
uv add "crewai[azure-ai-inference]"

# With LiteLLM for 100+ providers
uv add "crewai[litellm]"
```
Code Evidence
OpenAI API key requirement from `lib/crewai/src/crewai/llms/providers/openai/completion.py:336-338`:
```python
self.api_key = os.getenv("OPENAI_API_KEY")
if self.api_key is None:
    raise ValueError("OPENAI_API_KEY is required")
```
Anthropic API key requirement from `lib/crewai/src/crewai/llms/providers/anthropic/completion.py:225-227`:
```python
self.api_key = os.getenv("ANTHROPIC_API_KEY")
if self.api_key is None:
    raise ValueError("ANTHROPIC_API_KEY is required")
```
Gemini multi-key support from `lib/crewai/src/crewai/llms/providers/gemini/completion.py:113-119`:
```python
api_key or os.getenv("GOOGLE_API_KEY") or os.getenv("GEMINI_API_KEY")
self.project = project or os.getenv("GOOGLE_CLOUD_PROJECT")
self.location = location or os.getenv("GOOGLE_CLOUD_LOCATION") or "us-central1"
use_vertexai = os.getenv("GOOGLE_GENAI_USE_VERTEXAI", "").lower() == "true"
```
Optional provider import pattern from `lib/crewai/src/crewai/llms/providers/anthropic/completion.py:23-31`:
```python
try:
    from anthropic import Anthropic, AsyncAnthropic, transform_schema
except ImportError:
    raise ImportError(
        'Anthropic native provider not available, to install: uv add "crewai[anthropic]"'
    ) from None
```
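This guard pattern generalizes to any optional SDK. The sketch below applies it through `importlib` with a hypothetical helper name, so it runs without any provider SDK installed; the error message format imitates the one quoted above.

```python
import importlib

def load_optional_sdk(module_name, extra):
    """Import an optional provider SDK, or fail with an actionable message."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        # Point the user at the matching extras group, as CrewAI's
        # providers do in the snippet above.
        raise ImportError(
            f'{module_name} native provider not available, '
            f'to install: uv add "crewai[{extra}]"'
        ) from None
```

Using a stdlib module stands in for a real SDK here: `load_optional_sdk("json", "example")` succeeds, while a missing package raises the `ImportError` with the install hint.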
CLI provider enumeration from `lib/crewai/src/crewai/cli/constants.py:10-115`:
```python
ENV_VARS: dict[str, list[dict[str, Any]]] = {
    "openai": [{"prompt": "Enter your OPENAI API key", "key_name": "OPENAI_API_KEY"}],
    "anthropic": [{"prompt": "Enter your ANTHROPIC API key", "key_name": "ANTHROPIC_API_KEY"}],
    "gemini": [{"prompt": "Enter your GEMINI API key", "key_name": "GEMINI_API_KEY"}],
    # ... 12+ providers total
}
```
Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `ValueError: OPENAI_API_KEY is required` | No OpenAI key set | Set `OPENAI_API_KEY` in `.env` or environment |
| `ValueError: ANTHROPIC_API_KEY is required` | No Anthropic key set | Set `ANTHROPIC_API_KEY` in `.env` or environment |
| `ImportError: Anthropic native provider not available` | Missing optional dependency | Run `uv add "crewai[anthropic]"` |
| `ImportError: AWS Bedrock native provider not available` | Missing boto3 | Run `uv add "crewai[bedrock]"` |
| `ImportError: Azure AI Inference native provider not available` | Missing azure-ai-inference | Run `uv add "crewai[azure-ai-inference]"` |
| `ImportError: Google Generative AI provider not available` | Missing google-genai | Run `uv add "crewai[google-genai]"` |
Compatibility Notes
- Default Provider: OpenAI is the default; its SDK is included in core dependencies
- Native Providers: Anthropic, Gemini, Bedrock, and Azure have native integrations with direct SDK calls
- LiteLLM Fallback: For providers without native support, install `crewai[litellm]` to access 100+ providers
- Ollama (Local): No API key needed; defaults to `http://localhost:11434` as base URL
- Environment Files: CrewAI loads `.env` files via `python-dotenv` automatically
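Since at least one provider key must be present before any LLM call, a preflight check at startup catches misconfiguration early. This is an illustrative stdlib-only sketch; the variable list follows the Credentials section above and is not exhaustive.

```python
import os

# Illustrative subset of the credential variables documented above.
PROVIDER_VARS = [
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "GOOGLE_API_KEY",
    "GEMINI_API_KEY",
    "AWS_ACCESS_KEY_ID",
    "AZURE_API_KEY",
    "WATSONX_APIKEY",
    "GROQ_API_KEY",
]

def configured_providers(env=None):
    """Return the credential variables that are set and non-empty."""
    if env is None:
        env = os.environ
    return [name for name in PROVIDER_VARS if env.get(name)]
```

Calling `configured_providers()` before kicking off a Crew makes "no credentials at all" an explicit, early failure instead of a mid-run `ValueError`.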