Environment:Apache Airflow Python Runtime Environment
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, Runtime |
| Last Updated | 2026-02-08 20:00 GMT |
Overview
Python 3.10+ runtime environment on Debian Bookworm with core dependencies including SQLAlchemy 2.0+, Pendulum 3.1+, Pydantic 2.11+, and FastAPI 0.128+.
Description
This environment defines the Python runtime and core library stack required to run Apache Airflow 3.x. The runtime requires Python 3.10 or higher (with 3.14 explicitly excluded). The default production Docker image uses Python 3.12.12 on Debian Bookworm Slim. Key dependencies include SQLAlchemy 2.0.36+ for async database operations, Pendulum 3.1.0+ for timezone-aware datetime handling, Pydantic 2.11.0+ for data validation, and FastAPI 0.128.1+ for the REST API server. The build system uses hatchling 1.27.0 with custom build plugins.
Usage
Use this environment for all Airflow core operations including DAG authoring, scheduler operation, task execution, and API server hosting. This is the foundational prerequisite for every Implementation in the Apache Airflow knowledge graph.
System Requirements
| Category | Requirement | Notes |
|---|---|---|
| OS | Debian Bookworm (12) Slim | Default Docker base image; other Linux distributions supported |
| Python | >= 3.10, != 3.14 | Supported: 3.10, 3.11, 3.12, 3.13 |
| Default Python | 3.12.12 | Production Docker image default |
| Disk | 2GB+ | For Airflow installation and dependencies |
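The version constraint in the table above can be checked programmatically. A minimal sketch mirroring the `>=3.10,!=3.14` rule; the helper name is illustrative, not part of Airflow:

```python
import sys

def airflow_python_supported(version_info=None):
    """Mirror Airflow's requires-python = ">=3.10,!=3.14" constraint."""
    vi = version_info or sys.version_info
    major_minor = (vi[0], vi[1])
    return major_minor >= (3, 10) and major_minor != (3, 14)

print(airflow_python_supported())  # True on any supported interpreter
```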
Dependencies
System Packages
- `build-essential` (build time only)
- `git`
- `graphviz`, `graphviz-dev`
- `krb5-user`, `libkrb5-dev` (Kerberos support)
- `ldap-utils`, `libldap2-dev` (LDAP support)
- `libsasl2-2`, `libsasl2-dev`, `libsasl2-modules` (SASL)
- `libssl-dev`, `openssl`
- `libpq-dev` (PostgreSQL client development)
- `freetds-dev`, `freetds-bin` (MSSQL/SQL Server)
- `unixodbc`, `unixodbc-dev` (ODBC)
- `sqlite3`, `libsqlite3-dev`
- `libxmlsec1`
- `dumb-init` (container init process)
- `curl`, `wget`
Python Packages
- `sqlalchemy[asyncio]` >= 2.0.36
- `alembic` >= 1.13.1, < 2.0
- `pendulum` >= 3.1.0
- `pydantic` >= 2.11.0
- `fastapi[standard-no-fastapi-cloud-cli]` >= 0.128.1, < 0.128.4
- `cryptography` >= 41.0.0, < 46.0.0
- `aiosqlite` >= 0.20.0, < 0.22.0
- `importlib_metadata` >= 6.5 (Python < 3.12) or >= 7.0 (Python >= 3.12)
- `packaging` >= 25.0
- `pluggy` >= 1.6.0
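Whether the active environment actually satisfies this list can be audited with the stdlib `importlib.metadata` module. A small sketch; the helper function is illustrative, not an Airflow API:

```python
from importlib.metadata import PackageNotFoundError, version

# Core dependencies listed above (names only; version bounds omitted)
CORE_PACKAGES = ("sqlalchemy", "alembic", "pendulum", "pydantic", "fastapi")

def installed_versions(packages=CORE_PACKAGES):
    """Map each package name to its installed version string, or None if absent."""
    found = {}
    for pkg in packages:
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None
    return found

print(installed_versions())
```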
Build System
- `hatchling` = 1.27.0
- `GitPython` = 3.1.45
- `pip` >= 26.0.1 (or `uv` >= 0.10.0 as an alternative)
Optional Extras
- async: `eventlet` >= 0.37.0, `gevent` >= 25.4.1, `greenlet` >= 3.1.0
- kerberos: `pykerberos` >= 1.1.13, `requests-kerberos` >= 0.14.0
- gunicorn: `gunicorn` >= 23.0.0
- otel: `opentelemetry-exporter-prometheus` >= 0.47b0
- statsd: `statsd` >= 3.3.0
- graphviz: `graphviz` >= 0.20 (not supported on macOS)
- memray: `memray` >= 1.19.0
Credentials
The following environment variables control Airflow configuration:
- `AIRFLOW__DATABASE__SQL_ALCHEMY_CONN`: Database connection string (SQLAlchemy format)
- `AIRFLOW__DATABASE__SQL_ALCHEMY_CONN_ASYNC`: Async database connection string
- `AIRFLOW__<SECTION>__<KEY>`: Any configuration override following this pattern
- `AIRFLOW_CONN_<CONN_ID>`: Connection strings for external services
- `AIRFLOW_VAR_<KEY>`: Airflow Variables set via environment
- `_AIRFLOW_PATCH_GEVENT`: Enable gevent monkeypatching (advanced)
- `_AIRFLOW__AS_LIBRARY`: Skip initialization when using Airflow as a library
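The patterns above can be exercised directly from a shell. The connection ID, variable key, and credential values below are illustrative placeholders, not defaults:

```shell
# Override [database] sql_alchemy_conn via the AIRFLOW__<SECTION>__<KEY> pattern
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow:airflow@localhost:5432/airflow"

# Define an Airflow Connection as a URI (conn_id "my_postgres", lowercased from the suffix)
export AIRFLOW_CONN_MY_POSTGRES="postgresql://user:pass@dbhost:5432/mydb"

# Define an Airflow Variable (key "env_name", lowercased from the suffix)
export AIRFLOW_VAR_ENV_NAME="staging"
```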
Quick Install

```shell
# Install Airflow with common extras
pip install "apache-airflow[celery,postgres,redis]==3.1.7" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.1.7/constraints-3.12.txt"

# Or using uv (recommended)
uv pip install "apache-airflow[celery,postgres,redis]==3.1.7" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.1.7/constraints-3.12.txt"
```
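The constraints URL hard-codes the Python minor version (`3.12` above); it can instead be derived from the running interpreter so the pin always matches. The variable names are illustrative:

```shell
# Derive the constraints URL from the target Airflow version and the local Python
AIRFLOW_VERSION="3.1.7"
PYTHON_VERSION="$(python3 -c 'import sys; print(f"{sys.version_info.major}.{sys.version_info.minor}")')"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
echo "${CONSTRAINT_URL}"
# Then install against it:
#   pip install "apache-airflow[celery,postgres,redis]==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
```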
Code Evidence

Python version requirement from `pyproject.toml:44`:

```toml
requires-python = ">=3.10,!=3.14"
```

SQLite minimum version check from `airflow-core/src/airflow/configuration.py:480`:

```python
min_sqlite_version = (3, 15, 0)
```

Airflow version from `airflow-core/src/airflow/__init__.py:28`:

```python
__version__ = "3.2.0"
```

aiosqlite version constraint with reason from `airflow-core/pyproject.toml`:

```toml
# aiosqlite 0.22.0 hangs, see https://github.com/omnilib/aiosqlite/issues/310
"aiosqlite>=0.20.0,<0.22.0",
```
Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `requires-python >= 3.10` | Python version too old | Upgrade to Python 3.10 or higher |
| `ModuleNotFoundError: No module named 'airflow'` | Airflow not installed in active environment | `pip install apache-airflow` with constraints file |
| SQLite version error | SQLite < 3.15.0 | Upgrade system SQLite or use PostgreSQL/MySQL backend |
| `aiosqlite` hanging | aiosqlite 0.22.0 bug | Pin `aiosqlite<0.22.0` (done automatically by constraints) |
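The SQLite row above can be diagnosed with the stdlib `sqlite3` module, which reports the version of the linked SQLite library. A minimal sketch mirroring Airflow's `min_sqlite_version = (3, 15, 0)` check; the helper name is illustrative:

```python
import sqlite3

MIN_SQLITE_VERSION = (3, 15, 0)  # Airflow's documented minimum

def system_sqlite_ok(version_info=None):
    """True when the linked SQLite library meets Airflow's minimum."""
    vi = version_info or sqlite3.sqlite_version_info
    return tuple(vi) >= MIN_SQLITE_VERSION

print(sqlite3.sqlite_version, "OK" if system_sqlite_ok() else "too old; upgrade or switch backends")
```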
Compatibility Notes
- Python 3.14: Explicitly excluded (`!=3.14`) due to compatibility issues.
- macOS: The `graphviz` extra is not supported on macOS (excluded in pyproject.toml).
- FIPS Compliance: Python LTO can be disabled via `PYTHON_LTO="false"` build arg for FIPS-compliant builds.
- Package Manager: Both `pip` (26.0.1) and `uv` (0.10.0) are supported; `uv` is recommended for speed.
Related Pages
- Implementation:Apache_Airflow_StandaloneCommand_Entrypoint
- Implementation:Apache_Airflow_DAG_Constructor
- Implementation:Apache_Airflow_DagBag_Loader
- Implementation:Apache_Airflow_SchedulerJobRunner_Loop
- Implementation:Apache_Airflow_TaskInstance_Model
- Implementation:Apache_Airflow_BaseExecutor_Interface
- Implementation:Apache_Airflow_Execute_Workload
- Implementation:Apache_Airflow_TriggererJobRunner_Loop
- Implementation:Apache_Airflow_BaseOperator_Interface
- Implementation:Apache_Airflow_BaseHook_Interface
- Implementation:Apache_Airflow_ProvidersManager_Validation