Implementation: TobikoData Sqlmesh CircleCI Continue Config
| Knowledge Sources | |
|---|---|
| Domains | CI/CD, Testing, DevOps |
| Last Updated | 2026-02-07 20:00 GMT |
Overview
CircleCI continuation configuration that defines a comprehensive CI/CD pipeline for SQLMesh with multi-engine testing, style checks, and cross-platform compatibility testing.
Description
The continue_config.yml file implements CircleCI's dynamic configuration feature to orchestrate a complex testing strategy. It defines multiple job types, including style checks, documentation tests, unit tests on Windows, Docker-based engine tests, and cloud-based engine tests across 8+ database engines. The pipeline uses conditional execution via pipeline parameters (client, common, python) to selectively run tests based on which parts of the codebase changed, optimizing CI resource usage. It also defines migration tests for example projects, UI tests with Playwright, and VSCode extension tests.
The configuration supports 16+ SQL engine adapters, including DuckDB, Postgres, MySQL, MSSQL, Trino, Spark, ClickHouse, Snowflake, Databricks, BigQuery, Redshift, Athena, Fabric, and GCP Postgres, running them in parallel via matrix strategies with per-engine resource allocation.
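Because this is a continuation config, a separate setup pipeline must compute the parameters above and hand off to it. A minimal sketch of such a setup config, assuming CircleCI's path-filtering orb; the orb version and the path-to-parameter mapping shown here are illustrative, not taken from the repository:

version: 2.1
setup: true

orbs:
  # circleci/path-filtering maps changed file paths to pipeline parameters
  path-filtering: circleci/path-filtering@1.0.0

workflows:
  generate-config:
    jobs:
      - path-filtering/filter:
          base-revision: main
          # hand off to the continuation config documented on this page
          config-path: .circleci/continue_config.yml
          # illustrative mapping: path regex, parameter name, value
          mapping: |
            web/client/.* client true
            sqlmesh/.* python true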
Usage
This configuration is automatically triggered by CircleCI when a pull request is created or updated. The main workflow runs jobs in parallel where possible; the cloud engine tests start only after the Docker-based engine tests complete successfully.
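That ordering constraint is expressed with requires in the workflow definition. A hedged excerpt; the workflow name and exact job wiring are illustrative:

workflows:
  main_pr:
    jobs:
      - engine_tests_docker
      - engine_tests_cloud:
          # cloud tests start only after the Docker-based tests succeed
          requires:
            - engine_tests_docker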
Code Reference
Source Location
- Repository: TobikoData_Sqlmesh
- File: .circleci/continue_config.yml
Pipeline Parameters
parameters:
  client:
    type: boolean
    default: false
  common:
    type: boolean
    default: false
  python:
    type: boolean
    default: false
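These booleans are typically consumed by when clauses on workflows. A minimal sketch; the workflow name is illustrative:

workflows:
  python_tests:
    # run only when backend code or shared code changed
    when:
      or:
        - << pipeline.parameters.python >>
        - << pipeline.parameters.common >>
    jobs:
      - style_and_cicd_tests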
Key Jobs
jobs:
  style_and_cicd_tests:
    # Runs linters and code style checks
    # Tests Python 3.9, 3.10, 3.11, 3.12, 3.13
  engine_tests_docker:
    # Tests: duckdb, postgres, mysql, mssql, trino, spark, clickhouse, risingwave
  engine_tests_cloud:
    # Tests: snowflake, databricks, redshift, bigquery, clickhouse-cloud, athena, fabric, gcp-postgres
    # Only runs on main branch
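Per-engine fan-out like this is usually driven by a job matrix. A sketch, assuming the job takes an engine parameter (the parameter name is an assumption):

workflows:
  docker_engines:
    jobs:
      - engine_tests_docker:
          matrix:
            parameters:
              # one parallel job per engine
              engine: [duckdb, postgres, mysql, mssql, trino, spark, clickhouse, risingwave]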
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| pipeline.parameters.client | boolean | No | Enable client-specific tests (UI, frontend) |
| pipeline.parameters.common | boolean | No | Enable common tests affecting all components |
| pipeline.parameters.python | boolean | No | Enable Python/backend tests |
| pipeline.git.branch | string | Yes | Git branch name for conditional execution |
Outputs
| Name | Type | Description |
|---|---|---|
| test-results | Directory | JUnit XML test results stored for all jobs |
| pnpm-packages cache | Cache | Node.js dependencies cached by pnpm-lock.yaml checksum |
| CircleCI status checks | Status | Pass/fail status reported back to GitHub PR |
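The first two outputs correspond to standard CircleCI steps. An illustrative excerpt; the cache key prefix, lockfile path, and store path are assumptions consistent with the table above:

steps:
  # publish JUnit XML so CircleCI can render per-test results
  - store_test_results:
      path: test-results
  # cache Node.js dependencies keyed on the lockfile checksum
  - save_cache:
      key: pnpm-packages-v1-{{ checksum "web/client/pnpm-lock.yaml" }}
      paths:
        - ~/.local/share/pnpm/store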
Workflow Stages
Stage 1: Fast Checks
- Documentation tests (cimg/python:3.10)
- Style checks across Python 3.9-3.13 with ruff, mypy, black
- UI style checks (ESLint, Prettier)
- VSCode extension tests
Stage 2: Core Testing
- Windows CICD tests with fast unit test suite
- Migration tests for sushi and sushi_dbt projects
- UI tests with Playwright (Microsoft's Playwright Docker image)
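A hedged sketch of how the Playwright-based UI test job might be declared; the image tag and commands are assumptions, and pnpm is assumed to be available in the image (e.g. via corepack):

jobs:
  ui_tests:
    docker:
      # Microsoft's official Playwright image with browsers preinstalled
      - image: mcr.microsoft.com/playwright:v1.44.0-jammy
    steps:
      - checkout
      - run:
          name: Install dependencies and run UI tests
          working_directory: web/client
          command: |
            pnpm install
            pnpm run test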
Stage 3: Docker Engine Tests
- Parallel execution across 8 engines
- Uses the ubuntu-2404:2024.05.1 machine image with Docker layer caching (see the sketch after this list)
- 20-minute timeout for long-running tests
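The machine image and layer caching noted above map to a job stanza like this (sketch; only the image string comes from the document):

jobs:
  engine_tests_docker:
    machine:
      image: ubuntu-2404:2024.05.1
      # reuse image layers across runs to speed up container startup
      docker_layer_caching: true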
Stage 4: Cloud Engine Tests
- Requires successful Stage 3 completion
- Only runs on main branch
- Creates and tears down isolated test databases with UUID-based names (see the sketch after this list)
- Uses sqlmesh_cloud_database_integration context for credentials
- Handles Snowflake private keys, BigQuery service accounts
- 20-minute timeout with automatic cleanup on failure
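The UUID naming and cleanup-on-failure behavior can be expressed with a generated name plus a teardown step that always runs. A sketch; the Make target and variable name are hypothetical:

steps:
  - run:
      name: Generate isolated test database name
      command: echo "export TEST_DB_NAME=ci_$(uuidgen | tr -d '-')" >> "$BASH_ENV"
  - run:
      name: Tear down test database
      # hypothetical target; runs even when earlier steps fail
      command: make engine-test-teardown
      when: always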
Commands
Conditional Execution Commands
halt_unless_core:
  # Halts the job unless the common or python parameter is true, or the branch is main
halt_unless_client:
  # Halts the job unless the common or client parameter is true, or the branch is main
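A minimal sketch of halt_unless_core, assuming it wraps circleci-agent step halt in an unless conditional (halt_unless_client would swap python for client):

commands:
  halt_unless_core:
    steps:
      - unless:
          condition:
            or:
              - << pipeline.parameters.common >>
              - << pipeline.parameters.python >>
              - equal: [main, << pipeline.git.branch >>]
          steps:
            # stop the job early, reporting success
            - run: circleci-agent step halt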
Environment Variables
Style and CICD Tests
- PYTEST_XDIST_AUTO_NUM_WORKERS: 8 (parallel test execution)
Engine Tests
- SQLMESH__DISABLE_ANONYMIZED_ANALYTICS: "1"
- TEST_DB_NAME: Generated UUID-based database name
- SNOWFLAKE_PRIVATE_KEY_FILE: Path to decoded private key
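The decoded key file is typically produced by a step like the following; the base64-encoded source variable SNOWFLAKE_PRIVATE_KEY_B64 and the output path are assumptions:

steps:
  - run:
      name: Decode Snowflake private key
      command: |
        # write the decoded key and export its path for later steps
        echo "$SNOWFLAKE_PRIVATE_KEY_B64" | base64 -d > /tmp/snowflake_key.p8
        echo 'export SNOWFLAKE_PRIVATE_KEY_FILE=/tmp/snowflake_key.p8' >> "$BASH_ENV"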
Cloud Credentials
Managed via the CircleCI context sqlmesh_cloud_database_integration (attached to jobs as sketched after this list):
- Snowflake: SNOWFLAKE_ACCOUNT, SNOWFLAKE_WAREHOUSE, etc.
- BigQuery: Service account JSON credentials
- Databricks: DATABRICKS_CATALOG, DATABRICKS_SERVER_HOSTNAME, etc.
- Redshift: REDSHIFT_HOST, REDSHIFT_USER, etc.
- Athena: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
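Attaching the context makes these variables available to a job. An illustrative workflow entry; only the context name comes from the document:

workflows:
  cloud_engines:
    jobs:
      - engine_tests_cloud:
          context: sqlmesh_cloud_database_integration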
Usage Examples
Running Specific Engine Test Locally
# DuckDB tests (no setup required)
make duckdb-test
# Snowflake tests (requires credentials)
export SNOWFLAKE_ACCOUNT=myaccount
export SNOWFLAKE_USER=myuser
export SNOWFLAKE_PASSWORD=mypassword
export SNOWFLAKE_DATABASE=test_db
export SNOWFLAKE_WAREHOUSE=warehouse
make snowflake-test
Running Style Checks
make install-dev
make py-style
UI Testing
cd web/client
pnpm install
pnpm run test