Principle: Langchain-ai Langgraph Interrupt Stream Execution
| Property | Value |
|---|---|
| Concept | Running a graph until it hits an interrupt point and surfacing the interrupt data |
| Workflow | Human_in_the_Loop_Agent |
| Pipeline Stage | Execution / Streaming |
| Repository | Langchain_ai_Langgraph |
| Source | libs/langgraph/langgraph/pregel/main.py:L2407-2506, libs/langgraph/langgraph/errors.py:L84-90 |
Overview
When a compiled graph with interrupt points is executed via stream() or invoke(), the Pregel execution loop runs super-steps until it encounters an interrupt condition. At that point, the loop saves the current state to the checkpointer and yields an __interrupt__ stream event containing the interrupt data. The graph then halts, waiting for the client to resume execution with a Command.
Description
The Pregel execution loop processes the graph in a sequence of super-steps. Each super-step identifies runnable tasks (nodes whose trigger channels have been updated), executes them, and applies their writes to the shared state channels. Interrupt handling is woven into this loop at two points:
Interrupt-before handling
Before executing a super-step's tasks, the loop checks whether any of the pending task names appear in the interrupt_before_nodes list. If so:
- The current state is checkpointed.
- The tasks are not executed.
- An __interrupt__ event is emitted to the stream containing Interrupt objects for each pending task.
- The loop terminates.
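The interrupt-before check can be sketched as a minimal loop. This is a self-contained model, not the actual Pregel implementation; the function name, the task lists, and the save_checkpoint callback are all illustrative:

```python
def run_until_interrupt(tasks_per_step, interrupt_before, save_checkpoint):
    """Yield stream events, halting *before* any task listed in interrupt_before."""
    for step, tasks in enumerate(tasks_per_step):
        pending = [t for t in tasks if t in interrupt_before]
        if pending:
            save_checkpoint(step)  # state is saved before the tasks run
            # surface one Interrupt-like record per pending task, then stop
            yield {"__interrupt__": tuple({"task": t} for t in pending)}
            return
        for t in tasks:
            yield {t: f"ran {t}"}  # normal {node_name: output} event

# e.g. a two-step graph where the "approve" node is an interrupt-before point:
# list(run_until_interrupt([["plan"], ["approve"]], {"approve"}, lambda s: None))
```

Note that the pending tasks themselves never execute; only the checkpoint and the interrupt event are produced.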
Interrupt-after handling
After executing a super-step's tasks and applying their writes, the loop checks whether any of the completed task names appear in the interrupt_after_nodes list. If so:
- The current state (with the node's writes applied) is checkpointed.
- An __interrupt__ event is emitted.
- The loop terminates before proceeding to the next super-step.
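Interrupt-after mirrors the previous case, except the tasks execute first and their writes land in the checkpointed state. Again a self-contained sketch with illustrative names, not Pregel internals:

```python
def run_with_interrupt_after(tasks_per_step, interrupt_after):
    """Run super-steps; checkpoint and halt *after* any task in interrupt_after."""
    state = {}
    for tasks in tasks_per_step:
        for t in tasks:
            state[t] = "done"  # execute the task and apply its writes
        hit = [t for t in tasks if t in interrupt_after]
        if hit:
            checkpoint = dict(state)  # the node's writes are already included
            return checkpoint, {"__interrupt__": tuple(hit)}
    return dict(state), None
```

Running a three-step graph with "review" as an interrupt-after point stops before "publish" ever runs, but the checkpoint contains review's output.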
In-node GraphInterrupt
When a node calls interrupt(value), it raises a GraphInterrupt exception. The Pregel loop catches this exception during task execution:
- The GraphInterrupt exception contains a tuple of Interrupt objects.
- These interrupts are recorded as pending writes on the checkpoint.
- The loop emits an __interrupt__ stream event with the interrupt data.
- Execution halts.
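The catch-and-record path can be modeled with a toy exception class. GraphInterruptSketch, execute_task, and the record dicts below are stand-ins for langgraph's GraphInterrupt, task executor, and Interrupt objects, not the real API:

```python
class GraphInterruptSketch(Exception):
    """Stand-in for langgraph's GraphInterrupt: carries the interrupt payloads."""
    def __init__(self, interrupts):
        self.interrupts = tuple(interrupts)

def interrupt(value):
    # a node calls this to pause the graph and surface `value` to the client
    raise GraphInterruptSketch([{"value": value, "id": "intr-0"}])

def execute_task(node_fn, pending_writes, stream):
    """Run one node; translate an in-node interrupt into a stream event."""
    try:
        node_fn()
    except GraphInterruptSketch as exc:
        pending_writes.extend(exc.interrupts)      # recorded on the checkpoint
        stream.append({"__interrupt__": exc.interrupts})
        return False                               # halt execution
    return True                                    # node completed normally
```

The key point the sketch shows: the exception never escapes the executor; it is converted into checkpoint writes plus a stream event.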
The __interrupt__ stream event
Regardless of which mechanism triggered the interrupt, the client receives the interrupt data via the __interrupt__ key in the stream output. The value is a tuple of Interrupt objects, each with a value (the data surfaced to the human) and an id (for matching resume values).
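The shape of that payload, and how ids pair interrupts with resume values, can be sketched as follows. The value and id fields come from the description above; the InterruptRecord dataclass and resume_map helper are illustrative, not langgraph's own types:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InterruptRecord:   # illustrative stand-in for langgraph's Interrupt
    value: object        # the data surfaced to the human
    id: str              # used to match the resume value on restart

def resume_map(event, answers):
    """Pair each interrupt id in an __interrupt__ event with its human answer."""
    return {i.id: answers[i.id] for i in event["__interrupt__"]}
```

A client that receives {"__interrupt__": (InterruptRecord("OK to deploy?", "a1"),)} would answer by keying the human's response on "a1".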
Usage
Stream execution with interrupts is the standard way to run a human-in-the-loop graph:
- Call graph.stream(input, config) with a thread_id in the config.
- Iterate over stream events. Normal node outputs appear as {node_name: output}.
- When an interrupt occurs, the stream yields {"__interrupt__": (Interrupt(...), ...)}.
- The stream ends. The graph is now paused.
- The client inspects the interrupt data, obtains human input, and resumes via graph.stream(Command(resume=value), config).
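The client-side loop above can be sketched generically. Here stream_fn stands in for graph.stream (any callable yielding event dicts), ResumeCommand models Command(resume=...), and fake_stream is a toy graph used only to demonstrate the pattern:

```python
class ResumeCommand:  # stand-in for langgraph's Command(resume=...)
    def __init__(self, resume):
        self.resume = resume

def drive(stream_fn, config, first_input, ask_human):
    """Run to completion, resuming across interrupts with human-supplied values."""
    outputs, next_input = [], first_input
    while True:
        interrupted = None
        for event in stream_fn(next_input, config):
            if "__interrupt__" in event:
                interrupted = event["__interrupt__"]  # graph is now paused
            else:
                outputs.append(event)                 # normal {node: output} event
        if interrupted is None:
            return outputs                            # stream finished cleanly
        next_input = ResumeCommand(ask_human(interrupted))

def fake_stream(inp, config):
    """Toy stand-in for graph.stream: interrupts once, then finishes on resume."""
    if isinstance(inp, ResumeCommand):
        yield {"finish": inp.resume}
    else:
        yield {"work": "partial"}
        yield {"__interrupt__": ({"value": "approve?"},)}
```

With a real compiled graph, the same loop applies: the first stream call ends at the interrupt, and the second call passes Command(resume=...) with the same thread_id so the checkpointer restores the paused state.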
Theoretical Basis
Stream execution with interrupts implements a cooperative scheduling model within the Pregel framework:
- Interrupt event detection: The execution loop acts as a scheduler that checks interrupt conditions at well-defined points (before/after each super-step). This ensures that interrupts occur at consistent, predictable boundaries.
- GraphInterrupt exception as signal: The GraphInterrupt exception serves as a structured signal from a node to the execution loop. It is a subclass of GraphBubbleUp, which is caught and handled by the Pregel loop rather than propagating to the user. The docstring explicitly states: "Never raised directly, or surfaced to the user."
- Checkpoint-as-savepoint: Each interrupt creates a checkpoint that serves as a savepoint. The graph can be resumed from this savepoint, rewound to a prior savepoint, or branched from it. This is analogous to database savepoints in transaction management.
- Stream as communication channel: The __interrupt__ stream event is the primary communication mechanism between the graph execution and the calling client. It carries the interrupt payload out of the execution context to where a human can interact with it.