Environment:Astronomer_Astronomer_cosmos_Python_Airflow_Runtime
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, Data_Orchestration |
| Last Updated | 2026-02-07 17:00 GMT |
Overview
Python 3.10+ environment with Apache Airflow >= 2.9.0, dbt-core, and supporting libraries for orchestrating dbt projects as Airflow DAGs.
Description
This environment defines the core runtime stack for astronomer-cosmos. It requires Python 3.10 or higher and Apache Airflow 2.9.0 or later. The stack includes version-parsing utilities (packaging >= 22.0), data validation (pydantic >= 1.10.0), template rendering (Jinja2 >= 3.0.0), and serialization (msgpack). A dbt-core installation with at least one database adapter is required to actually execute dbt commands. The codebase contains extensive conditional logic to support both Airflow 2.x and Airflow 3.x, using import fallbacks to absorb API changes between the major versions.
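The version-conditional logic can be illustrated with a minimal sketch. The version string is hard-coded here for illustration; in Cosmos it is parsed from `airflow.__version__` (see the Code Evidence section below):

```python
# Minimal sketch of Cosmos-style version gating; in the real codebase the
# version comes from airflow.__version__ (cosmos/constants.py).
from packaging.version import Version

AIRFLOW_VERSION = Version("2.10.5")  # hard-coded for illustration

# Features that exist only on one major line are guarded with comparisons:
if AIRFLOW_VERSION >= Version("3.0"):
    feature_set = "airflow-3 SDK imports"
else:
    feature_set = "airflow-2 legacy imports"

print(feature_set)  # airflow-2 legacy imports
```

`packaging.version.Version` compares versions semantically rather than lexically, which is why it is a core dependency of the stack.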
Usage
Use this environment for all Cosmos workflows: DAG rendering, TaskGroup integration, local/remote dbt execution, documentation generation, and plugin hosting. Every Implementation page in the Cosmos wiki requires this base environment.
System Requirements
| Category | Requirement | Notes |
|---|---|---|
| OS | Linux, macOS, Windows (WSL) | Platform-independent per pyproject.toml classifiers |
| Python | >= 3.10 | Supported: 3.10, 3.11, 3.12, 3.13 |
| Airflow | >= 2.9.0 | Airflow 2.9.0 and 2.9.1 are partially supported (dataset breaking change) |
| Disk | Adequate for dbt project + cache | Cache dir defaults to system temp directory |
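The temp-directory default in the last row can be sketched with the stdlib; the `cosmos` subdirectory name is an assumption for illustration, not a confirmed constant:

```python
# Sketch of the cache-directory default: Cosmos falls back to the system temp
# directory (macOS users can set TMPDIR for a stable path across runs).
import tempfile
from pathlib import Path

DEFAULT_CACHE_DIR = Path(tempfile.gettempdir()) / "cosmos"  # subdir name is illustrative
print(DEFAULT_CACHE_DIR)
```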
Dependencies
System Packages
- No OS-level system packages are strictly required beyond a standard Python installation
Python Packages (Core)
- `apache-airflow` >= 2.9.0
- `aenum` (any version)
- `attrs` (any version)
- `deprecation` (any version, for Python 3.13 compatibility)
- `Jinja2` >= 3.0.0
- `msgpack` (any version)
- `packaging` >= 22.0
- `pydantic` >= 1.10.0
- `virtualenv` (any version)
- `openlineage-integration-common` (any version, required for Airflow datasets/assets)
Python Packages (dbt Adapters, Optional)
`dbt-postgres`, `dbt-bigquery`, `dbt-snowflake`, `dbt-redshift`, `dbt-spark`, `dbt-databricks` (!= 1.9.0), `dbt-duckdb`, `dbt-athena-community`, `dbt-clickhouse`, `dbt-exasol`, `dbt-mysql`, `dbt-oracle`, `dbt-starrocks`, `dbt-sqlserver`, `dbt-teradata`, `dbt-vertica` (<= 1.5.4)
Credentials
The following environment variables may be set:
- `ASTRONOMER_ENVIRONMENT`: Set to "cloud" in Astro Cloud environments (auto-detected)
- `OPENLINEAGE_NAMESPACE`: OpenLineage namespace override (default: "cosmos")
- `DO_NOT_TRACK`: Opt out of telemetry when set to a truthy value
- `SCARF_NO_ANALYTICS`: Opt out of Scarf analytics when set to a truthy value
- `TMPDIR`: Override the temp directory path (macOS users may need this for stable cache paths)
Quick Install
```bash
# Install Cosmos with a dbt adapter (example: postgres)
pip install "astronomer-cosmos[dbt-postgres]"

# Install with multiple adapters and cloud storage
pip install "astronomer-cosmos[dbt-postgres,amazon,google,microsoft]"

# Install all supported dbt adapters
pip install "astronomer-cosmos[all]"
```
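After installing, a quick sanity check with the stdlib `importlib.metadata` confirms the package is resolvable:

```python
# Post-install sanity check (stdlib only); reports the installed Cosmos version.
from importlib import metadata

try:
    cosmos_version = metadata.version("astronomer-cosmos")
    installed = True
except metadata.PackageNotFoundError:
    cosmos_version = None
    installed = False

print(f"installed={installed}, version={cosmos_version}")
```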
Code Evidence
Python version requirement from pyproject.toml:11:
```toml
requires-python = ">=3.10"
```
Airflow version extraction and partially supported versions from cosmos/constants.py:10,35-37:
```python
AIRFLOW_VERSION = Version(airflow.__version__)

# Cosmos will not emit datasets for the following Airflow versions, due to a breaking change
# that's fixed in later Airflow 2.x versions
# https://github.com/apache/airflow/issues/39486
PARTIALLY_SUPPORTED_AIRFLOW_VERSIONS = [Version("2.9.0"), Version("2.9.1")]
```
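The constant above implies a simple membership check. A hedged sketch of that guard, with the runtime version hard-coded for illustration:

```python
# Sketch of the dataset-emission guard; the constant mirrors
# cosmos/constants.py, the runtime version is hard-coded here.
from packaging.version import Version

PARTIALLY_SUPPORTED_AIRFLOW_VERSIONS = [Version("2.9.0"), Version("2.9.1")]
airflow_version = Version("2.9.1")  # illustration only

# Dataset emission is skipped on the partially supported versions:
emit_datasets = airflow_version not in PARTIALLY_SUPPORTED_AIRFLOW_VERSIONS
print(emit_datasets)  # False on 2.9.0/2.9.1
```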
Airflow 3.x import fallback pattern used across the codebase (example from cosmos/cache.py:26-32):
```python
try:
    from airflow.sdk import ObjectStoragePath
except ImportError:
    from airflow.io.path import ObjectStoragePath
```
Known dbt-databricks conflict from pyproject.toml:47-50:
```toml
# The dbt-databricks:1.9.0 version causes a dependency conflict with
# the Pydantic version required by Airflow (version > 2.7)
# See: https://github.com/astronomer/astronomer-cosmos/issues/1379
"dbt-databricks!=1.9.0",
```
Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `AirflowCompatibilityError` | Airflow version incompatible with the requested feature | Upgrade Airflow to >= 2.9.0 |
| `CosmosConfigException: Unable to find the dbt executable` | dbt not installed or not on `PATH` | Install a dbt adapter package (e.g., `pip install dbt-postgres`) |
| Dataset events not emitted | Airflow 2.9.0 or 2.9.1 breaking change | Upgrade to Airflow 2.9.2+ or 2.10+ |
| `ImportError: openlineage-integration-common` | Missing OpenLineage dependency | `pip install openlineage-integration-common` |
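The "Unable to find the dbt executable" row can be reproduced with a hypothetical stdlib lookup; Cosmos needs a dbt binary reachable on `PATH` unless an explicit executable path is configured:

```python
# Hypothetical check mirroring the dbt-executable lookup; the real error is
# raised by Cosmos as CosmosConfigException at DAG parse or run time.
import shutil

dbt_path = shutil.which("dbt")
if dbt_path is None:
    message = "dbt not found; install an adapter, e.g. pip install dbt-postgres"
else:
    message = f"dbt found at {dbt_path}"
print(message)
```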
Compatibility Notes
- Airflow 2.9.0 and 2.9.1: Dataset emission is disabled due to a known Airflow breaking change (apache/airflow#39486). Upgrade to 2.9.2+ for dataset support.
- Airflow 3.x: Full support via conditional imports. SDK module paths differ from Airflow 2 (e.g., `airflow.sdk.bases.operator.BaseOperator` vs `airflow.models.BaseOperator`).
- dbt-databricks 1.9.0: Excluded due to a Pydantic version conflict with Airflow.
- dbt-vertica: Pinned to <= 1.5.4.
- gcsfs: Pinned to < 2025.3.0 due to fsspec compatibility issue.
- openlineage-integration-common: Version 1.15.0 excluded from the openlineage extra.
Related Pages
- Implementation:Astronomer_Astronomer_cosmos_ProjectConfig_Init
- Implementation:Astronomer_Astronomer_cosmos_ProfileConfig_Init
- Implementation:Astronomer_Astronomer_cosmos_RenderConfig_Init
- Implementation:Astronomer_Astronomer_cosmos_ExecutionConfig_Init
- Implementation:Astronomer_Astronomer_cosmos_DbtDag_Init
- Implementation:Astronomer_Astronomer_cosmos_DbtTaskGroup_Init
- Implementation:Astronomer_Astronomer_cosmos_DbtGraph_Load_And_Build
- Implementation:Astronomer_Astronomer_cosmos_DbtRunLocalOperator_Execute
- Implementation:Astronomer_Astronomer_cosmos_Watcher_Operators
- Implementation:Astronomer_Astronomer_cosmos_Task_Dependency_Wiring
- Implementation:Astronomer_Astronomer_cosmos_DbtDocsLocalOperator_Execute
- Implementation:Astronomer_Astronomer_cosmos_DbtDocsCloudOperator_Init
- Implementation:Astronomer_Astronomer_cosmos_Cosmos_Plugin
- Implementation:Astronomer_Astronomer_cosmos_Operator_Args_Pattern