Implementation: Astronomer Cosmos SparkThriftProfileMapping
Knowledge Sources
| Field | Value |
|---|---|
| Domains | Profile_Mapping, Spark |
| Last Updated | 2026-02-07 17:00 GMT |
Overview
A concrete profile mapping, provided by astronomer-cosmos, that translates Airflow Spark connections into dbt Spark profiles.
Description
The SparkThriftProfileMapping maps an Airflow Spark connection to a dbt Spark profile. It translates connection parameters (host, login, password, schema, port, extras) into the YAML structure that dbt expects in `profiles.yml`. This variant targets Spark through the Thrift server connection method (`method: thrift` in the dbt profile).
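For illustration, the translation can be sketched as a plain function. This is a hypothetical simplification, not the library's implementation: the real mapping reads these values from the Airflow connection object and routes secret fields through environment variables rather than inlining them.

```python
def spark_thrift_profile(host, schema, port=10000, user=None):
    """Sketch of the dbt Spark profile dict a Thrift-based mapping produces.

    Hypothetical helper for illustration only; astronomer-cosmos derives
    these values from the Airflow connection, not from function arguments.
    """
    profile = {
        "type": "spark",     # dbt adapter name
        "method": "thrift",  # connect via the Spark Thrift server
        "host": host,
        "port": port,
        "schema": schema,
    }
    if user is not None:
        profile["user"] = user
    return profile

# Example: a connection to a local Thrift server
print(spark_thrift_profile("localhost", "analytics", user="airflow"))
```

The resulting dictionary corresponds to one target entry under a profile in `profiles.yml`.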
Usage
Use this profile mapping when configuring Cosmos to run dbt commands against a Spark backend. Assign it to `ProfileConfig(profile_mapping=...)` when the target Airflow connection uses the `spark` connection type with Thrift server access.
Code Reference
Source Location
- Repository: astronomer/astronomer-cosmos
- File: cosmos/profiles/spark/thrift.py
Signature
class SparkThriftProfileMapping(BaseProfileMapping):
airflow_connection_type: str = "spark"
dbt_profile_type: str = "spark"
Import
from cosmos.profiles.spark import SparkThriftProfileMapping
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| conn_id | str | Yes | Airflow connection ID for Spark |
Outputs
| Name | Type | Description |
|---|---|---|
| profile | dict | dbt profile YAML dictionary |
| env_vars | dict | Environment variables for secret fields |
Usage Examples
from cosmos.config import ProfileConfig
from cosmos.profiles.spark import SparkThriftProfileMapping
profile_config = ProfileConfig(
profile_name="default",
target_name="dev",
profile_mapping=SparkThriftProfileMapping(conn_id="spark_default"),
)
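The `conn_id` above must resolve to an Airflow connection of type `spark`. One common way to supply it is Airflow's `AIRFLOW_CONN_<CONN_ID>` environment-variable URI convention; the exact URI below (host, port, and schema path) is illustrative, so check your deployment for the canonical format.

```python
import os
from urllib.parse import urlparse

# Illustrative: define the "spark_default" connection as a URI.
# The host and port point at a hypothetical Spark Thrift server;
# Airflow maps the URI path to the connection's schema field.
os.environ["AIRFLOW_CONN_SPARK_DEFAULT"] = (
    "spark://airflow@spark-thrift.example.com:10000/analytics"
)

# Airflow parses this URI into a Connection object; a quick sanity
# check with the standard library shows the pieces it will extract:
parsed = urlparse(os.environ["AIRFLOW_CONN_SPARK_DEFAULT"])
print(parsed.hostname, parsed.port)  # spark-thrift.example.com 10000
```

Passing `conn_id="spark_default"` to SparkThriftProfileMapping then picks up this connection at runtime.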