Implementation: Astronomer Cosmos DbtDocsLocalOperator Execute
Metadata
| Field | Value |
|---|---|
| Page Type | Implementation |
| Knowledge Sources | Repo (astronomer-cosmos), Doc (dbt Docs) |
| Domains | Data_Engineering, Documentation |
| Last Updated | 2026-02-07 14:00 GMT |
Overview
Concrete tool for executing dbt docs generate within an Airflow pipeline provided by the astronomer-cosmos library. DbtDocsLocalOperator is a specialized operator that runs the dbt documentation generation command locally on the Airflow worker, producing the standard documentation artifacts (index.html, manifest.json, catalog.json) in the dbt target directory.
Description
DbtDocsLocalOperator extends DbtLocalBaseOperator to execute the dbt docs generate command. It inherits all the local execution infrastructure from its parent class, including profile rendering, virtual environment support, and dbt invocation mechanics.
The operator defines three required files that must be present in the target directory after successful execution:
- index.html -- the browsable documentation UI
- manifest.json -- project metadata and DAG
- catalog.json -- database schema information
If the --static flag is used, dbt produces a static_index.html instead, which bundles all artifacts into a single self-contained file.
This operator is typically used as the first step in a two-step documentation pipeline: generate the docs, then upload them to cloud storage using one of the DbtDocsCloudLocalOperator subclasses.
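The required-files contract described above can be sketched in plain standard-library Python. The helper below is illustrative only; `check_required_files` and its signature are not part of the Cosmos API:

```python
from pathlib import Path

# The three artifacts that must exist in the dbt target directory
# after `dbt docs generate` completes successfully.
REQUIRED_FILES = ["index.html", "manifest.json", "catalog.json"]

def check_required_files(target_dir: str) -> list[str]:
    """Return the list of required artifacts missing from target_dir."""
    target = Path(target_dir)
    return [name for name in REQUIRED_FILES if not (target / name).exists()]
```

A post-run check like this is the natural failure signal for the two-step pipeline: if any artifact is missing, there is nothing valid to upload.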
Code Reference
Source Location
| Source | File | Lines |
|---|---|---|
| astronomer-cosmos repo | cosmos/operators/local.py | L1156-1179 |
Signature
class DbtDocsLocalOperator(DbtLocalBaseOperator):
    """Run the dbt docs generate command."""

    required_files = ["index.html", "manifest.json", "catalog.json"]
    base_cmd = ["docs", "generate"]

    def __init__(self, **kwargs: Any) -> None:
        super().__init__(**kwargs)
        self.base_cmd = ["docs", "generate"]
Import
from cosmos.operators.local import DbtDocsLocalOperator
I/O Contract
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| project_dir | str | Yes | Path to the dbt project directory containing dbt_project.yml. |
| profile_config | ProfileConfig | Yes | Configuration for the dbt profile, including database connection details and target name. |
| install_deps | bool | No | Whether to run dbt deps before generating docs. Defaults to True in the base operator. |
| **kwargs | Any | No | Additional keyword arguments passed to DbtLocalBaseOperator, including env, vars, and Airflow operator parameters. |
Outputs
| Output | Type | Description |
|---|---|---|
| Documentation artifacts | Files | Generated artifacts in the dbt target directory: index.html, manifest.json, and catalog.json. If the --static flag is used, static_index.html is produced instead. |
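Downstream tasks can consume these artifacts directly. As an illustration, the hypothetical helper below counts documented models by parsing manifest.json (node keys in dbt manifests are prefixed by resource type, e.g. `model.<project>.<name>`); it is a sketch, not part of Cosmos:

```python
import json
from pathlib import Path

def count_models(target_dir: str) -> int:
    """Count model nodes recorded in the generated manifest.json."""
    manifest = json.loads((Path(target_dir) / "manifest.json").read_text())
    # dbt keys each node as "<resource_type>.<project>.<name>".
    return sum(1 for key in manifest.get("nodes", {}) if key.startswith("model."))
```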
Usage Examples
Basic Documentation Generation
from cosmos.config import ProfileConfig
from cosmos.operators.local import DbtDocsLocalOperator
from cosmos.profiles import PostgresUserPasswordProfileMapping

profile_config = ProfileConfig(
    profile_name="my_project",
    target_name="dev",
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id="my_postgres_conn",
        profile_args={"schema": "public"},
    ),
)

generate_docs = DbtDocsLocalOperator(
    task_id="generate_dbt_docs",
    project_dir="/usr/local/airflow/dags/dbt/my_project",
    profile_config=profile_config,
    install_deps=True,
)
Documentation Generation as Part of a DAG
from datetime import datetime

from airflow import DAG
from cosmos.operators.local import DbtDocsLocalOperator, DbtDocsS3LocalOperator

# profile_config is a ProfileConfig instance, defined as in the previous example.

with DAG(
    dag_id="dbt_docs_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
) as dag:
    generate = DbtDocsLocalOperator(
        task_id="generate_docs",
        project_dir="/usr/local/airflow/dags/dbt/my_project",
        profile_config=profile_config,
    )
    upload = DbtDocsS3LocalOperator(
        task_id="upload_docs_to_s3",
        project_dir="/usr/local/airflow/dags/dbt/my_project",
        profile_config=profile_config,
        aws_conn_id="my_aws_conn",
        bucket_name="my-dbt-docs-bucket",
    )
    generate >> upload