
Implementation:ClickHouse ClickHouse Data Lakes Importer

From Leeroopedia


Knowledge Sources
Domains: Testing, Data_Lakes
Last Updated: 2026-02-08 00:00 GMT

Overview

Utility for importing Parquet data into data lake formats (Iceberg, Delta, Hudi) for testing.

Description

Uses PySpark to convert Parquet files into Iceberg, Delta Lake, or Hudi formats for integration testing.

Usage

Use for generating test data in data lake formats, testing data lake integrations, or converting between formats.
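Since main() parses the format, input, and output arguments from the command line, its argument handling might look like the following minimal sketch (parse_args and the help strings are illustrative assumptions, not the importer's actual code):

```python
import argparse

# Hypothetical sketch of the argument parsing done by main() in
# data-lakes-importer.py; the real script may use different flag names.
def parse_args(argv):
    parser = argparse.ArgumentParser(
        description="Import a Parquet file into a data lake format")
    parser.add_argument("format", choices=["iceberg", "delta", "hudi"],
                        help="target data lake format")
    parser.add_argument("input", help="path to the source Parquet file")
    parser.add_argument("output", help="directory for the converted table")
    return parser.parse_args(argv)

args = parse_args(["iceberg", "/data/input.parquet", "/data/iceberg_output"])
print(args.format, args.input, args.output)
```

Restricting the first positional to the three supported formats makes an unsupported format fail fast with a usage error instead of a Spark exception later.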

Code Reference

Source Location

Signature

def get_spark_for_iceberg(result_path)  # Spark session configured for Iceberg, writing to result_path
def get_spark_for_delta()               # Spark session configured for Delta Lake
def get_spark_for_hudi()                # Spark session configured for Hudi

def main():
    # Parses command line arguments
    # Converts Parquet to specified data lake format
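The page does not show the bodies of the three helpers, but each presumably builds a SparkSession with the format-specific settings its docs recommend. As a hedged sketch, the configuration each helper would apply can be written as plain dictionaries (the function names and exact keys here are assumptions based on the Iceberg, Delta Lake, and Hudi Spark guides, not the importer's actual code):

```python
# Hypothetical sketches of the Spark configs each get_spark_for_* helper
# would apply via SparkSession.builder.config(key, value). The actual
# importer may use different catalog names or additional settings.

def iceberg_conf(result_path):
    # Iceberg: register its SQL extensions and a Hadoop-backed catalog
    # whose warehouse directory is the importer's output path.
    return {
        "spark.sql.extensions": "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
        "spark.sql.catalog.spark_catalog": "org.apache.iceberg.spark.SparkSessionCatalog",
        "spark.sql.catalog.spark_catalog.type": "hadoop",
        "spark.sql.catalog.spark_catalog.warehouse": result_path,
    }

def delta_conf():
    # Delta Lake: SQL extension plus the Delta catalog implementation.
    return {
        "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension",
        "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    }

def hudi_conf():
    # Hudi: Kryo serialization plus the Hudi SQL extension.
    return {
        "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
        "spark.sql.extensions": "org.apache.spark.sql.hudi.HoodieSparkSessionExtension",
    }
```

Only `get_spark_for_iceberg` takes the output path because Iceberg's catalog needs a warehouse location up front; Delta and Hudi take the destination at write time instead.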

Import

#!/usr/bin/env python3
./data-lakes-importer.py <format> <input.parquet> <output_path>

where <format> is one of iceberg, delta, or hudi.

Usage Examples

# Convert Parquet to Iceberg
./data-lakes-importer.py iceberg /data/input.parquet /data/iceberg_output

# Convert to Delta Lake
./data-lakes-importer.py delta /data/input.parquet /data/delta_output

# Convert to Hudi
./data-lakes-importer.py hudi /data/input.parquet /data/hudi_output
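When driving the importer from an integration-test harness, the invocations above are typically assembled programmatically and run via subprocess. A small hypothetical helper (import_command is an assumption, not part of the importer) keeps the format validation in one place:

```python
# Hypothetical test-harness helper that assembles the importer command
# line shown above, for use with subprocess.run(...).
def import_command(fmt, parquet_path, output_path):
    supported = {"iceberg", "delta", "hudi"}
    if fmt not in supported:
        raise ValueError(f"unsupported format: {fmt}")
    return ["./data-lakes-importer.py", fmt, parquet_path, output_path]

print(import_command("iceberg", "/data/input.parquet", "/data/iceberg_output"))
```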
