Environment:CrewAIInc CrewAI Python Runtime Environment
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, AI_Agents |
| Last Updated | 2026-02-11 17:00 GMT |
Overview
A Python 3.10–3.13 runtime environment bundling 23+ core dependencies for orchestrating autonomous AI agents with CrewAI.
Description
This environment defines the base Python runtime and core package dependencies required to run any CrewAI workflow. It is built on Python 3.10+ (up to 3.13) and includes Pydantic for data modeling, OpenAI SDK for default LLM communication, ChromaDB for vector storage, and OpenTelemetry for observability. The environment uses uv as its package manager and includes special PyTorch index configuration for Python 3.13 compatibility.
Usage
Use this environment for all CrewAI workflows. It is the mandatory base prerequisite for running any Agent, Task, Crew, Flow, or Knowledge operation. Every Implementation page in this wiki requires this environment.
System Requirements
| Category | Requirement | Notes |
|---|---|---|
| OS | Linux, macOS, or Windows | Cross-platform Python support |
| Python | >=3.10, <3.14 | Python 3.13 uses PyTorch nightly builds due to torch 2.5.0 incompatibility |
| Disk | 2GB+ | For core dependencies and ChromaDB vector storage |
| Network | Internet access | Required for LLM API calls and telemetry |
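The Python constraint above (`>=3.10, <3.14`) can be checked up front before installing anything. A minimal stdlib-only sketch (the function name and structure are illustrative, not part of CrewAI):

```python
import sys

# Supported range taken from pyproject.toml: requires-python = ">=3.10, <3.14"
MIN_VERSION = (3, 10)
MAX_EXCLUSIVE = (3, 14)

def python_supported(version_info=sys.version_info) -> bool:
    """Return True if the interpreter falls inside the supported range."""
    major_minor = (version_info[0], version_info[1])
    return MIN_VERSION <= major_minor < MAX_EXCLUSIVE

if __name__ == "__main__":
    if not python_supported():
        raise SystemExit(
            f"Python {sys.version_info.major}.{sys.version_info.minor} "
            "is outside the supported range >=3.10, <3.14"
        )
```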
Dependencies
System Packages
- Python 3.10, 3.11, 3.12, or 3.13
- `uv` package manager (installed automatically as a dependency)
Core Python Packages
- `pydantic` ~=2.11.9
- `openai` >=1.83.0,<3
- `instructor` >=1.3.3
- `pdfplumber` ~=0.11.4
- `regex` ~=2026.1.15
- `opentelemetry-api` ~=1.34.0
- `opentelemetry-sdk` ~=1.34.0
- `opentelemetry-exporter-otlp-proto-http` ~=1.34.0
- `chromadb` ~=1.1.0
- `tokenizers` ~=0.20.3
- `openpyxl` ~=3.1.5
- `python-dotenv` ~=1.1.1
- `pyjwt` >=2.9.0,<3
- `click` ~=8.1.7
- `appdirs` ~=1.4.4
- `jsonref` ~=1.1.0
- `json-repair` ~=0.25.2
- `tomli-w` ~=1.1.0
- `tomli` ~=2.0.2
- `json5` ~=0.10.0
- `portalocker` ~=2.7.0
- `pydantic-settings` ~=2.10.1
- `mcp` ~=1.26.0
- `aiosqlite` ~=0.21.0
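Most pins above use the `~=` (PEP 440 compatible-release) operator, which allows patch-level upgrades but not the next minor release. A small illustrative helper (not part of CrewAI) makes the semantics concrete by expanding a pin into its equivalent bounds:

```python
def compatible_release_bounds(spec: str) -> tuple[str, str]:
    """Expand a PEP 440 compatible-release pin like '~=2.11.9'
    into its equivalent lower/upper bound pair ('>=2.11.9', '<2.12')."""
    version = spec.removeprefix("~=").strip()
    parts = version.split(".")
    # Upper bound: drop the last component and bump the one before it.
    upper = parts[:-1]
    upper[-1] = str(int(upper[-1]) + 1)
    return (f">={version}", f"<{'.'.join(upper)}")
```

For example, `pydantic ~=2.11.9` permits 2.11.10 but excludes 2.12.0.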
Credentials
The following environment variables may be set; at minimum, one LLM provider key is required:
- `OPENAI_API_KEY`: OpenAI API key (default LLM provider)
- `CREWAI_DISABLE_TELEMETRY`: Set to `"true"` to disable anonymous telemetry
- `CREWAI_DISABLE_TRACKING`: Alternative flag to disable telemetry
- `CREWAI_STORAGE_DIR`: Override default storage directory (defaults to current working directory name)
- `CREWAI_DISABLE_VERSION_CHECK`: Set to `"true"` to skip version check on startup
- `TOKENIZERS_PARALLELISM`: Automatically set to `"false"` by CrewAI to suppress fastembed logging
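The telemetry flags combine with a simple OR, as the snippet quoted under Code Evidence shows: setting any one of the three variables to `"true"` (case-insensitive) disables telemetry. A stdlib sketch of that check (the function name is illustrative; the three-way comparison mirrors `telemetry.py`):

```python
import os

def telemetry_disabled(environ=os.environ) -> bool:
    """True if any of the three opt-out variables is set to 'true'
    (case-insensitive), mirroring the check in telemetry.py."""
    return (
        environ.get("OTEL_SDK_DISABLED", "false").lower() == "true"
        or environ.get("CREWAI_DISABLE_TELEMETRY", "false").lower() == "true"
        or environ.get("CREWAI_DISABLE_TRACKING", "false").lower() == "true"
    )
```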
Quick Install
```shell
# Install CrewAI with core dependencies
pip install crewai

# Or with uv (recommended)
uv add crewai
```
Code Evidence
Python version requirement from `lib/crewai/pyproject.toml:9`:
```toml
requires-python = ">=3.10, <3.14"
```
Python 3.11+ conditional import from `lib/crewai/src/crewai/cli/utils.py:20`:
```python
if sys.version_info >= (3, 11):
    import tomllib  # Python 3.11+ standard library
```
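On Python 3.10, where `tomllib` is not in the standard library, the `tomli` dependency fills the gap. A common fallback pattern (a sketch of the idea; CrewAI's own code, per the evidence above, branches on the version instead):

```python
try:
    import tomllib  # standard library on Python 3.11+
except ModuleNotFoundError:
    import tomli as tomllib  # backport package used on Python 3.10

# Both modules expose the same loads() API, so callers are unaffected.
config = tomllib.loads('requires-python = ">=3.10, <3.14"')
print(config["requires-python"])
```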
PyTorch compatibility workaround from `lib/crewai/pyproject.toml:110-125`:
```toml
# PyTorch index configuration, since torch 2.5.0 is not compatible with python 3.13
[[tool.uv.index]]
name = "pytorch-nightly"
url = "https://download.pytorch.org/whl/nightly/cpu"
explicit = true

[tool.uv.sources]
torch = [
    { index = "pytorch-nightly", marker = "python_version >= '3.13'" },
    { index = "pytorch", marker = "python_version < '3.13'" },
]
```
Telemetry disabling from `lib/crewai/src/crewai/telemetry/telemetry.py:149-151`:
```python
os.getenv("OTEL_SDK_DISABLED", "false").lower() == "true"
or os.getenv("CREWAI_DISABLE_TELEMETRY", "false").lower() == "true"
or os.getenv("CREWAI_DISABLE_TRACKING", "false").lower() == "true"
```
Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `requires-python ">=3.10, <3.14"` | Python version out of range | Install Python 3.10, 3.11, 3.12, or 3.13 |
| `ModuleNotFoundError: No module named 'tomllib'` | Python 3.10 missing tomllib | The framework auto-falls back to `tomli`; ensure `tomli` is installed |
| `torch 2.5.0 is not compatible with python 3.13` | PyTorch incompatibility | Use `uv` with configured PyTorch nightly index, or use Python 3.12 |
Compatibility Notes
- Python 3.10: `tomllib` standard library module not available; uses `tomli` package as fallback
- Python 3.13: PyTorch requires nightly builds via special `uv` index configuration
- Windows: Supported but some CLI features may behave differently; uses `LOCALAPPDATA` for token storage
- CI Environments: Auto-detected via `CI` environment variable; adjusts console formatting accordingly
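The CI note above says detection keys off the `CI` environment variable, which most providers (GitHub Actions, GitLab CI, CircleCI) set to `true`. A minimal sketch of that pattern; the exact check CrewAI performs may differ, and `running_in_ci` is a hypothetical name:

```python
import os

def running_in_ci(environ=os.environ) -> bool:
    """Detect CI by the conventional CI environment variable.
    Assumption: truthy values are 'true' or '1' (case-insensitive)."""
    return environ.get("CI", "").lower() in ("true", "1")
```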
Related Pages
- Implementation:CrewAIInc_CrewAI_Crew_Constructor
- Implementation:CrewAIInc_CrewAI_Agent_Constructor
- Implementation:CrewAIInc_CrewAI_Task_Constructor
- Implementation:CrewAIInc_CrewAI_Crew_Kickoff
- Implementation:CrewAIInc_CrewAI_Flow_Decorators
- Implementation:CrewAIInc_CrewAI_Knowledge_Constructor
- Implementation:CrewAIInc_CrewAI_Memory_Subsystem
- Implementation:CrewAIInc_CrewAI_BaseTool_Schema
- Implementation:CrewAIInc_CrewAI_MCP_Server_Config