
Principle:PrefectHQ Prefect Flow Orchestration

From Leeroopedia


Metadata
Last Updated 2026-02-09 00:00 GMT

Overview

A decorator-based mechanism that transforms standard Python functions into observable, resilient workflow entry points with automatic state tracking, logging, and failure recovery.

Description

The @flow decorator is Prefect's fundamental building block for workflow orchestration. It wraps a Python function to provide:

  • Automatic state tracking -- each flow run transitions through well-defined states (Pending, Running, Completed, Failed)
  • Parameter validation -- flow arguments are validated automatically
  • Print capture -- with log_prints=True, print() statements inside the flow are captured as structured logs
  • Nested flow support -- flows can call other flows (subflows) for composability
  • Integration with Prefect's deployment system -- flows can be scheduled, triggered via API, and monitored in the Prefect UI

Flows can be synchronous or asynchronous and may call tasks or other flows internally. Every invocation of a flow creates a flow run -- a tracked, logged, observable unit of execution.
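The flow-run lifecycle described above can be illustrated with a minimal, pure-Python sketch -- a toy decorator that records state transitions around a function call. This is a conceptual illustration only, not Prefect's actual implementation; the State names mirror those listed above.

```python
from enum import Enum
from functools import wraps

class State(Enum):
    PENDING = "Pending"
    RUNNING = "Running"
    COMPLETED = "Completed"
    FAILED = "Failed"

def tracked(fn):
    """Toy stand-in for @flow: record the state transitions of each run."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        # Every invocation is a new "run" with its own state history.
        states = [State.PENDING, State.RUNNING]
        try:
            result = fn(*args, **kwargs)
            states.append(State.COMPLETED)
            return result, states
        except Exception:
            states.append(State.FAILED)
            return None, states
    return wrapper

@tracked
def pipeline(x):
    if x < 0:
        raise ValueError("bad input")
    return x * 2

result, states = pipeline(21)   # result == 42, states end in COMPLETED
_, failed_states = pipeline(-1)  # failed_states end in FAILED
```

In real Prefect, these transitions are persisted to the API backend so that the UI, alerts, and retry machinery can observe them; the sketch only shows the state-machine shape of a run.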

Usage

Use @flow when you need a top-level entry point for:

  • A data pipeline or ETL process
  • An ML training or inference job
  • Any multi-step process that benefits from observability, retries, and scheduling

A typical pattern is to define a flow that orchestrates multiple tasks:

from prefect import flow, task

@task
def extract() -> list[str]:
    # Pull raw records from a source system (illustrative placeholder data)
    return ["alpha", "beta"]

@task
def transform(data: list[str]) -> list[str]:
    # Normalize each record
    return [item.upper() for item in data]

@task
def load(data: list[str]) -> None:
    # Persist the result; print() is captured as a log because log_prints=True
    print(f"Loaded {len(data)} records: {data}")

@flow(name="my_etl", log_prints=True)
def etl_pipeline():
    raw = extract()
    cleaned = transform(raw)
    load(cleaned)

Theoretical Basis

The flow decorator implements the Orchestration Pattern -- a centralized controller that manages the execution of a series of discrete steps. Unlike bare function calls, orchestrated flows provide:

  • State machine semantics -- each run transitions through well-defined states, enabling monitoring and alerting
  • Automatic retry on failure -- transient errors can be recovered from without manual intervention
  • Event emission for monitoring -- flow runs emit events that the Prefect UI and API can consume
  • Parameterized execution for reusability -- the same flow definition can be invoked with different inputs
Aspect             Bare Function          @flow Decorated Function
State tracking     None                   Pending -> Running -> Completed/Failed
Retry on failure   Manual                 Automatic with configurable policy
Logging            Manual print/logging   Structured, captured automatically
Observability      None                   Full UI visibility, events, alerts
Scheduling         Manual (cron, etc.)    Built-in via Prefect deployments
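The "automatic retry with configurable policy" row can be sketched as a plain-Python decorator. This is a conceptual illustration, not Prefect's implementation (Prefect exposes the policy declaratively, via arguments such as retries on the decorator); the with_retries name and the flaky example are hypothetical.

```python
import time
from functools import wraps

def with_retries(max_retries=2, delay_seconds=0.0):
    """Toy retry policy: re-run the function on failure, up to max_retries extra attempts."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            attempts = 0
            while True:
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    attempts += 1
                    if attempts > max_retries:
                        raise  # policy exhausted: surface the original error
                    time.sleep(delay_seconds)
        return wrapper
    return decorator

calls = {"count": 0}

@with_retries(max_retries=2)
def flaky():
    # Simulate a transient failure that succeeds on the third attempt
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(flaky())  # succeeds on the third attempt
```

The key point is that the retry policy lives in the orchestration layer, not in the business logic: flaky() itself contains no retry code, just as a Prefect flow body stays free of recovery boilerplate.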

The Orchestration Pattern is particularly valuable in data engineering and ML workflows where:

  • Operations are long-running and may fail partway through
  • Visibility into execution progress is critical
  • Reproducibility and parameterization are required
  • Teams need to schedule, monitor, and alert on pipeline health
