Implementation: Astronomer Cosmos AWS EKS Operators
| Knowledge Sources | |
|---|---|
| Domains | AWS, EKS, Operators |
| Last Updated | 2026-02-07 17:00 GMT |
Overview
Provides Airflow operators for running dbt commands on Amazon Elastic Kubernetes Service (EKS) by extending the Cosmos Kubernetes base operator with EKS-specific authentication and cluster configuration.
Description
The aws_eks module defines DbtAwsEksBaseOperator, which inherits from DbtKubernetesBaseOperator. Rather than combining unrelated base classes, this operator builds on the existing Kubernetes operator infrastructure within Cosmos and adds the AWS EKS-specific concerns: cluster-name resolution, region selection, and AWS connection handling. At execution time, the operator authenticates against the specified EKS cluster using the configured AWS connection, generates or retrieves a kubeconfig, and then delegates pod creation and dbt command execution to the underlying Kubernetes machinery.
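The execution flow described above can be illustrated with a minimal sketch. The stub base class and the `Sketch` suffix are hypothetical stand-ins, not the real Cosmos classes; the kubeconfig handling is simplified to a dict:

```python
# Hypothetical sketch of the EKS base-operator pattern: the real
# DbtKubernetesBaseOperator lives in Cosmos and is stubbed here.

class DbtKubernetesBaseOperator:  # stub standing in for the Cosmos base class
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def execute(self, context):
        # In Cosmos, this creates the pod and runs the dbt command.
        return "pod launched"


class DbtAwsEksBaseOperatorSketch(DbtKubernetesBaseOperator):
    """Adds EKS-specific concerns on top of the Kubernetes base operator."""

    def __init__(self, cluster_name, pod_name="dbt", namespace="default",
                 aws_conn_id="aws_default", region=None, **kwargs):
        # EKS-specific state: which cluster, which credentials, which region.
        self.cluster_name = cluster_name
        self.aws_conn_id = aws_conn_id
        self.region = region
        super().__init__(name=pod_name, namespace=namespace, **kwargs)

    def execute(self, context):
        # 1. Authenticate against EKS using the AWS connection (stubbed).
        # 2. Generate a kubeconfig pointing at self.cluster_name.
        # 3. Delegate pod creation and dbt execution to the Kubernetes base.
        kubeconfig = {"cluster": self.cluster_name, "region": self.region}
        return super().execute({**context, "kubeconfig": kubeconfig})
```

The point of the pattern is that all pod mechanics stay in the Kubernetes base; the EKS subclass only supplies authentication and cluster targeting.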
Eight concrete operator classes cover the standard dbt sub-commands: build, ls, seed, snapshot, run, test, run-operation, and clone. Each concrete class sets its own ui_color and ui_fgcolor for visual distinction in the Airflow UI and inherits all execution logic from the base.
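The subclass pattern is small enough to sketch. The class names below are shortened and the color values are invented for illustration; only the shape (a pinned sub-command plus per-class UI colors) reflects the description above:

```python
# Hypothetical sketch of the concrete-operator pattern: each subclass pins
# one dbt sub-command and its own Airflow UI colors, inheriting everything
# else. Color values here are made up, not taken from the Cosmos source.

class DbtAwsEksBase:  # stand-in for DbtAwsEksBaseOperator
    base_cmd: list = []
    ui_color = "#ffffff"
    ui_fgcolor = "#000000"


class DbtRunAwsEks(DbtAwsEksBase):
    base_cmd = ["run"]
    ui_color = "#7352ba"   # illustrative only
    ui_fgcolor = "#f4f2fc"


class DbtTestAwsEks(DbtAwsEksBase):
    base_cmd = ["test"]
    ui_color = "#8194e0"   # illustrative only
```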
Usage
Use the AWS EKS operators when your dbt workloads should run as Kubernetes pods on an EKS cluster. This is well-suited for teams that already operate Kubernetes-based infrastructure on AWS and want to leverage existing cluster autoscaling, IAM roles for service accounts, and pod-level resource limits for dbt execution. Compared to the ECS operators, the EKS operators offer finer-grained pod configuration such as custom volumes, init containers, and sidecar patterns.
Code Reference
Source Location
- Repository: Astronomer_Astronomer_cosmos
- File: cosmos/operators/aws_eks.py
Signature
class DbtAwsEksBaseOperator(DbtKubernetesBaseOperator):
    """
    Base class for running dbt commands as pods on AWS EKS.
    """

    def __init__(
        self,
        cluster_name: str,
        pod_name: str = "dbt",
        namespace: str = "default",
        aws_conn_id: str = "aws_default",
        region: str | None = None,
        **kwargs,
    ) -> None:
        ...
Concrete operators:
class DbtBuildAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtLSAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtSeedAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtSnapshotAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtRunAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtTestAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtRunOperationAwsEksOperator(DbtAwsEksBaseOperator): ...
class DbtCloneAwsEksOperator(DbtAwsEksBaseOperator): ...
Import
from cosmos.operators.aws_eks import DbtRunAwsEksOperator
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| cluster_name | str | Yes | The name of the EKS cluster where the dbt pod will be scheduled. |
| pod_name | str | No | The name assigned to the Kubernetes pod. Defaults to "dbt". |
| namespace | str | No | The Kubernetes namespace in which to launch the pod. Defaults to "default". |
| aws_conn_id | str | No | Airflow connection ID for AWS credentials used to authenticate with EKS. Defaults to "aws_default". |
| region | str or None | No | The AWS region where the EKS cluster resides. If not provided, it is inferred from the connection or environment. |
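The region fallback described in the table can be sketched as a small resolution chain. The function name, the `region_name` extra key, and the exact fallback order are assumptions for illustration; only "explicit argument wins, otherwise infer" comes from the description above:

```python
import os

def resolve_region(explicit, conn_extra):
    """Hypothetical region resolution: explicit arg, then connection
    extras, then the AWS_DEFAULT_REGION environment variable."""
    if explicit:
        return explicit
    if conn_extra.get("region_name"):
        return conn_extra["region_name"]
    return os.environ.get("AWS_DEFAULT_REGION")
```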
Outputs
| Name | Type | Description |
|---|---|---|
| pod_name | str | The name of the Kubernetes pod that executed the dbt command, available via XCom. |
| pod_namespace | str | The namespace of the Kubernetes pod that was created. |
| log_output | str | Standard output and error from the dbt command, captured from the pod logs. |
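A downstream task could consume the outputs above via XCom. The helper below is a hypothetical sketch that operates on a plain dict shaped like the pulled XCom values; it is not part of the Cosmos API:

```python
def summarize_dbt_pod(xcom):
    """Format the pod outputs listed above (pod_name, pod_namespace,
    log_output) into a one-line summary. Hypothetical helper."""
    log_lines = xcom["log_output"].splitlines()
    return (
        f"dbt ran in pod {xcom['pod_name']} "
        f"(namespace {xcom['pod_namespace']}); "
        f"{len(log_lines)} log lines captured"
    )
```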
Usage Examples
# Run dbt models on EKS; extra kwargs such as image are forwarded to the
# underlying Kubernetes pod machinery.
from cosmos.operators.aws_eks import DbtRunAwsEksOperator

run_dbt = DbtRunAwsEksOperator(
    task_id="dbt_run_on_eks",
    cluster_name="analytics-eks-cluster",
    pod_name="dbt-run-pod",
    namespace="data-pipelines",
    aws_conn_id="aws_default",
    region="us-east-1",
    image="my-dbt-image:latest",
)

# Seed reference data; pod_name, aws_conn_id, and region fall back to
# their defaults when omitted.
from cosmos.operators.aws_eks import DbtSeedAwsEksOperator

seed_dbt = DbtSeedAwsEksOperator(
    task_id="dbt_seed_on_eks",
    cluster_name="analytics-eks-cluster",
    namespace="data-pipelines",
    image="my-dbt-image:latest",
)