Implementation: ClickHouse Data Lakes Importer
| Knowledge Sources | |
|---|---|
| Domains | Testing, Data_Lakes |
| Last Updated | 2026-02-08 00:00 GMT |
Overview
Utility for importing Parquet data into data lake formats (Iceberg, Delta, Hudi) for testing.
Description
Uses PySpark to convert Parquet files into Iceberg, Delta Lake, or Hudi formats for integration testing.
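Each `get_spark_for_*` helper presumably builds a SparkSession with the format-specific settings that engine requires. A minimal sketch of those settings, assuming the standard quickstart configuration for each engine (the `spark_conf_for` helper and catalog name `local` are illustrative, not taken from the script):

```python
def spark_conf_for(fmt: str, warehouse_path: str = "/tmp/warehouse") -> dict:
    """Sketch of the SparkSession options each data lake format typically needs.

    Hypothetical helper: the actual script may set different or additional
    options. Values follow the standard Iceberg/Delta/Hudi quickstart configs.
    """
    if fmt == "iceberg":
        # Iceberg: SQL extensions plus a Hadoop catalog rooted at the warehouse path
        return {
            "spark.sql.extensions": "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
            "spark.sql.catalog.local": "org.apache.iceberg.spark.SparkCatalog",
            "spark.sql.catalog.local.type": "hadoop",
            "spark.sql.catalog.local.warehouse": warehouse_path,
        }
    if fmt == "delta":
        # Delta Lake: SQL extension and Delta catalog replacing the session catalog
        return {
            "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
            "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog",
        }
    if fmt == "hudi":
        # Hudi: Kryo serialization plus the Hudi SQL extension
        return {
            "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
            "spark.sql.extensions": "org.apache.hudi.HoodieSparkSessionExtension",
        }
    raise ValueError(f"unsupported format: {fmt}")
```

The dict would then be applied option by option via `SparkSession.builder.config(key, value)` before calling `getOrCreate()`.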
Usage
Use for generating test data in data lake formats, testing data lake integrations, or converting between formats.
Code Reference
Source Location
- Repository: ClickHouse
- File: utils/data-lakes-importer.py
- Lines: 1-120
Signature
def get_spark_for_iceberg(result_path):
def get_spark_for_delta():
def get_spark_for_hudi():
def main():
    # Parses command-line arguments
    # Converts Parquet to the specified data lake format
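The argument handling in `main()` can be sketched from the CLI shape shown in the usage examples below: a format name followed by an input Parquet file and an output path. The `parse_args` helper here is illustrative, not the script's actual code:

```python
import argparse

def parse_args(argv=None):
    # Mirrors the documented CLI:
    #   ./data-lakes-importer.py <format> <input.parquet> <output_path>
    parser = argparse.ArgumentParser(
        description="Import Parquet data into a data lake format for testing"
    )
    parser.add_argument("format", choices=["iceberg", "delta", "hudi"],
                        help="target data lake format")
    parser.add_argument("input_parquet", help="path to the input Parquet file")
    parser.add_argument("output_path", help="where to write the converted table")
    return parser.parse_args(argv)
```

With `choices` set, an unknown format fails fast with a usage message instead of surfacing later as a Spark error.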
Invocation
The script is directly executable (it starts with #!/usr/bin/env python3):
./data-lakes-importer.py <format> <input.parquet> <output_path>
Usage Examples
# Convert Parquet to Iceberg
./data-lakes-importer.py iceberg /data/input.parquet /data/iceberg_output
# Convert to Delta Lake
./data-lakes-importer.py delta /data/input.parquet /data/delta_output
# Convert to Hudi
./data-lakes-importer.py hudi /data/input.parquet /data/hudi_output
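The conversion behind these commands is presumably a straight read/write through the Spark DataFrame API: read the Parquet input, then write it back using the chosen format string. A minimal sketch, assuming a per-format SparkSession is already built (the `convert` helper and the `test_table` Hudi table name are illustrative, not taken from the script):

```python
def convert(spark, fmt, input_parquet, output_path):
    """Illustrative conversion: read Parquet, write in the requested format."""
    df = spark.read.parquet(input_parquet)
    writer = df.write.format(fmt).mode("overwrite")
    if fmt == "hudi":
        # Hudi writes require at least a table name option;
        # record/precombine key options depend on the input schema.
        writer = writer.option("hoodie.table.name", "test_table")
    writer.save(output_path)
```

Iceberg and Delta accept a bare `format(...).save(...)` write in this setup, which keeps the dispatch to a single conditional.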