Principle: Langgenius Dify Workflow Execution
| Knowledge Sources | Dify |
|---|---|
| Domains | Workflow, DAG, Frontend |
| Last Updated | 2026-02-12 00:00 GMT |
Overview
Description
Workflow Execution and Debugging is the principle that governs how DAG workflows are run and monitored in real time within the Dify platform. Workflow execution uses Server-Sent Events (SSE) to stream granular progress updates from the backend to the frontend as each node in the graph starts, processes, and completes. This streaming architecture enables a live debugging experience where the developer can observe the workflow's execution path through the DAG in real time.
The execution lifecycle is managed through three key operations:
- Fetch Workflow Draft: Load the current draft state of the workflow graph before execution begins. This ensures the UI has the latest node configurations and variable wiring.
- Stream Execution: Initiate the workflow run and consume an SSE event stream that delivers typed events for each phase of execution -- from workflow start to individual node transitions to final completion.
- Stop Execution: Abort a running workflow by sending a stop signal to the backend, which terminates processing and emits a workflow_finished event.
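The three lifecycle operations can be sketched as follows. This is a minimal illustration, not Dify's actual API: the endpoint paths, the `Fetcher` shape, and the payload fields are assumptions; only the operation names (fetch draft, run, stop) come from the principle above.

```typescript
// Sketch of the three lifecycle operations. Endpoint paths and the
// injected `Fetcher` are illustrative assumptions, not Dify's real API.
type Fetcher = (url: string, init?: { method?: string; body?: string }) => Promise<unknown>;

// Load the latest draft graph so the UI reflects current node configs.
export async function fetchWorkflowDraft(doFetch: Fetcher, appId: string) {
  return doFetch(`/apps/${appId}/workflows/draft`);
}

// Kick off a run; the response body would be consumed as an SSE stream.
export async function startWorkflowRun(doFetch: Fetcher, appId: string, inputs: object) {
  return doFetch(`/apps/${appId}/workflows/run`, {
    method: "POST",
    body: JSON.stringify({ inputs }),
  });
}

// Cooperatively abort a running workflow; the backend then emits
// a workflow_finished event for the aborted run.
export async function stopWorkflowRun(doFetch: Fetcher, appId: string, taskId: string) {
  return doFetch(`/apps/${appId}/workflows/tasks/${taskId}/stop`, { method: "POST" });
}
```

Injecting the fetch function keeps the sketch testable and makes explicit that transport details (auth headers, base URL) live outside these operations.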
The SSE event model provides fine-grained observability into the DAG execution:
- Workflow-level events: workflow_started, workflow_finished, workflow_paused
- Node-level events: node_started, node_finished, node_retry
- Container-level events: iteration_started, iteration_next, iteration_completed, loop_started, loop_next, loop_completed
- Parallel execution events: parallel_branch_started, parallel_branch_finished
- Output events: text_chunk, text_replace, message, agent_message
- Human-in-the-loop events: human_input_required, human_input_form_filled, human_input_form_timeout
- Data source events: datasource_processing, datasource_completed, datasource_error
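A typed dispatcher ties these event names to per-event callbacks. The sketch below derives a callback key such as onNodeStarted from the snake_case event name; the payload shape and helper names are assumptions for illustration, not Dify's implementation.

```typescript
// Sketch of a typed event-to-callback dispatcher for the SSE model above.
// Payload shape and helper names are illustrative assumptions.
interface SSEEvent {
  event: string;                      // e.g. "node_started", "workflow_finished"
  data?: Record<string, unknown>;
}

type Callbacks = Partial<Record<string, (e: SSEEvent) => void>>;

// Map an event name like "node_started" to a callback key "onNodeStarted".
export function callbackKey(eventName: string): string {
  return "on" + eventName
    .split("_")
    .map((w) => w.charAt(0).toUpperCase() + w.slice(1))
    .join("");
}

// Dispatch one parsed event to its registered handler, if any; returns
// whether a handler was found, so unknown events can be logged or ignored.
export function dispatchEvent(ev: SSEEvent, callbacks: Callbacks): boolean {
  const handler = callbacks[callbackKey(ev.event)];
  if (!handler) return false;
  handler(ev);
  return true;
}
```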
Usage
When executing and debugging a workflow:
- Call fetchWorkflowDraft to load the current draft graph state before initiating a run.
- Start the workflow execution (via a POST endpoint) and consume the response as an SSE stream using the handleStream function.
- Register callback handlers for the SSE events you need to observe (e.g., onNodeStarted to highlight active nodes, onNodeFinished to display outputs).
- Call stopWorkflowRun to abort a running workflow when the user requests cancellation.
- The handleStream function reads the SSE response body incrementally, parses data: lines as JSON, dispatches to the appropriate typed callback based on the event field, and calls onCompleted when the stream ends.
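The incremental parsing step can be sketched as a small stateful parser: buffer partial chunks, split on newlines, and parse complete data: lines as JSON. The factory name and return shape here are illustrative assumptions, not the actual handleStream code.

```typescript
// Minimal sketch of the incremental SSE parsing described above.
// Names and shapes are illustrative assumptions.
export interface SSEParser {
  push(chunk: string): object[];      // feed a network chunk, get parsed events
}

export function createSSEParser(): SSEParser {
  let buffer = "";
  return {
    push(chunk: string): object[] {
      buffer += chunk;
      const events: object[] = [];
      let nl: number;
      // Only consume lines that are complete; keep the remainder buffered
      // so a JSON payload split across network chunks is reassembled intact.
      while ((nl = buffer.indexOf("\n")) !== -1) {
        const line = buffer.slice(0, nl).trim();
        buffer = buffer.slice(nl + 1);
        if (line.startsWith("data:")) {
          events.push(JSON.parse(line.slice(5).trim()));
        }
      }
      return events;
    },
  };
}
```

In the real flow, each parsed object would be routed to the matching typed callback based on its event field.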
Theoretical Basis
This principle is grounded in:
- Server-Sent Events (SSE) Protocol: SSE provides a unidirectional streaming channel from server to client over HTTP, which is well-suited for long-running pipeline executions where the client needs real-time progress updates but does not need to send data back during execution. SSE is simpler than WebSockets for this use case and works naturally with HTTP infrastructure.
- Event-Driven Architecture: Each SSE event represents a discrete state transition in the workflow's execution. The frontend subscribes to these events and updates its visualization accordingly, following the Observer pattern. This decouples the execution engine from the presentation layer.
- Incremental Streaming Parser: The handleStream implementation reads chunks from the response body's ReadableStream, buffers partial lines, and parses complete data: lines as JSON objects. This handles network fragmentation gracefully and avoids blocking on large payloads.
- Graceful Cancellation: The stopWorkflowRun function implements the cooperative cancellation pattern: the client signals a stop and the backend gracefully terminates processing, ensuring resources are cleaned up and partial results remain available.