
Principle:Langchain ai Langgraph Interrupt Stream Execution

From Leeroopedia
Concept: Running a graph until it hits an interrupt point and surfacing the interrupt data
Workflow: Human_in_the_Loop_Agent
Pipeline Stage: Execution / Streaming
Repository: Langchain_ai_Langgraph
Source: libs/langgraph/langgraph/pregel/main.py:L2407-2506, libs/langgraph/langgraph/errors.py:L84-90

Overview

When a compiled graph with interrupt points is executed via stream() or invoke(), the Pregel execution loop runs super-steps until it encounters an interrupt condition. At that point, the loop saves the current state to the checkpointer and yields an __interrupt__ stream event containing the interrupt data. The graph then halts, waiting for the client to resume execution with a Command.

Description

The Pregel execution loop processes the graph in a sequence of super-steps. Each super-step identifies runnable tasks (nodes whose trigger channels have been updated), executes them, and applies their writes to the shared state channels. Interrupt handling is woven into this loop at two points:

Interrupt-before handling

Before executing a super-step's tasks, the loop checks whether any of the pending task names appear in the interrupt_before_nodes list. If so:

  1. The current state is checkpointed.
  2. The tasks are not executed.
  3. An __interrupt__ event containing an Interrupt object for each pending task is emitted to the stream.
  4. The loop terminates.
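
The four steps above can be sketched with a minimal stdlib loop. Everything here (the `interrupt_before_nodes` argument, the interrupt dicts, the `saved_checkpoints` list) is an illustrative stand-in for LangGraph's internals, not its actual code:

```python
# Illustrative sketch of the interrupt-before check (not LangGraph's code).

saved_checkpoints = []  # stand-in for the checkpointer

def superstep_with_interrupt_before(pending_tasks, interrupt_before_nodes, state):
    """Yield stream events, halting before any task listed in interrupt_before_nodes."""
    hits = [t for t in pending_tasks if t in interrupt_before_nodes]
    if hits:
        saved_checkpoints.append(dict(state))         # 1. checkpoint the current state
        interrupts = tuple(                           # 3. one payload per pending task
            {"value": f"paused before {t}", "id": f"int-{i}"}
            for i, t in enumerate(hits)
        )
        yield {"__interrupt__": interrupts}           # 2./3. tasks are not executed
        return                                        # 4. the loop terminates
    for t in pending_tasks:
        yield {t: f"ran {t}"}                         # normal super-step output

events = list(superstep_with_interrupt_before(
    ["approve", "notify"], interrupt_before_nodes=["approve"], state={"step": 1}))
```

Note that a single matching task name is enough to pause the whole super-step: none of the pending tasks run.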

Interrupt-after handling

After executing a super-step's tasks and applying their writes, the loop checks whether any of the completed task names appear in the interrupt_after_nodes list. If so:

  1. The current state (with the node's writes applied) is checkpointed.
  2. An __interrupt__ event is emitted.
  3. The loop terminates before proceeding to the next super-step.
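
The interrupt-after variant differs in ordering: tasks execute and their writes land before the check fires. A stdlib sketch, again with stand-in names rather than LangGraph's internals:

```python
# Illustrative sketch of the interrupt-after check (not LangGraph's code).

def superstep_with_interrupt_after(tasks, interrupt_after_nodes, state):
    """Run every task, apply its writes, then halt if any task name matches."""
    events = []
    for name, fn in tasks:
        writes = fn(state)
        state.update(writes)                          # apply the node's writes
        events.append({name: writes})
    hits = [name for name, _ in tasks if name in interrupt_after_nodes]
    if not hits:
        return events, dict(state), False             # proceed to the next super-step
    checkpoint = dict(state)                          # 1. writes are already applied
    events.append({"__interrupt__": tuple(            # 2. emit the interrupt event
        {"value": f"paused after {n}"} for n in hits)})
    return events, checkpoint, True                   # 3. halt before the next super-step

events, checkpoint, halted = superstep_with_interrupt_after(
    [("review", lambda s: {"reviewed": True})],
    interrupt_after_nodes=["review"],
    state={"draft": "v1"})
```

Because the writes are checkpointed, resuming later continues from the post-node state rather than re-running the node.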

In-node GraphInterrupt

When a node calls interrupt(value), it raises a GraphInterrupt exception. The Pregel loop catches this exception during task execution:

  1. The GraphInterrupt exception contains a tuple of Interrupt objects.
  2. These interrupts are recorded as pending writes on the checkpoint.
  3. The loop emits an __interrupt__ stream event with the interrupt data.
  4. Execution halts.
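
The exception-as-signal mechanism can be modeled in a few lines. `BubbleUp` and `ToyGraphInterrupt` below are stand-ins for LangGraph's GraphBubbleUp and GraphInterrupt, reduced to the behavior described above:

```python
# Illustrative sketch of exception-based interrupts (stand-ins, not LangGraph's classes).

class BubbleUp(Exception):
    """Caught and handled by the loop; never surfaced to the user."""

class ToyGraphInterrupt(BubbleUp):
    def __init__(self, interrupts):
        super().__init__(interrupts)
        self.interrupts = tuple(interrupts)           # 1. a tuple of interrupt payloads

def node(state):
    # stands in for a node body that calls interrupt(value)
    raise ToyGraphInterrupt([{"value": "need human approval", "id": "int-0"}])

def execute_task(fn, state, pending_writes, stream):
    try:
        fn(state)
        return True                                   # task completed normally
    except ToyGraphInterrupt as exc:
        pending_writes.extend(exc.interrupts)         # 2. recorded as pending writes
        stream.append({"__interrupt__": exc.interrupts})  # 3. stream event emitted
        return False                                  # 4. execution halts

pending_writes, stream = [], []
still_running = execute_task(node, {}, pending_writes, stream)
```

Catching the exception at the task boundary is what keeps it from ever reaching user code, matching the docstring quoted later in this page.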

The __interrupt__ stream event

Regardless of which mechanism triggered the interrupt, the client receives the interrupt data via the __interrupt__ key in the stream output. The value is a tuple of Interrupt objects, each with a value (the data surfaced to the human) and an id (for matching resume values).
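
The event shape can be modeled with a small dataclass. This Interrupt is an illustrative stand-in for the class LangGraph exposes, reduced to the two fields described above:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Interrupt:        # illustrative stand-in, reduced to the two fields above
    value: Any          # the data surfaced to the human
    id: str             # matched against the resume value on resumption

event = {"__interrupt__": (Interrupt(value={"question": "Approve the draft?"}, id="int-0"),)}
first = event["__interrupt__"][0]
```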

Usage

Stream execution with interrupts is the standard way to run a human-in-the-loop graph:

  1. Call graph.stream(input, config) with a thread_id in the config.
  2. Iterate over stream events. Normal node outputs appear as {node_name: output}.
  3. When an interrupt occurs, the stream yields {"__interrupt__": (Interrupt(...), ...)}.
  4. The stream ends. The graph is now paused.
  5. The client inspects the interrupt data, obtains human input, and resumes via graph.stream(Command(resume=value), config).
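
The five-step flow above can be simulated end to end with the standard library. `stream()`, `Command`, and the `_PAUSED` store below are stand-ins for graph.stream, LangGraph's Command, and the checkpointer; the node names are invented for illustration:

```python
# Stdlib simulation of the client-side pause/resume flow (stand-ins, not the real API).

class Command:
    def __init__(self, resume=None):
        self.resume = resume

_PAUSED = {}  # thread_id -> saved state, standing in for the checkpointer

def stream(graph_input, config):
    thread_id = config["configurable"]["thread_id"]
    if isinstance(graph_input, Command):              # 5. resuming a paused thread
        state = _PAUSED.pop(thread_id)
        state["decision"] = graph_input.resume
        yield {"finalize": state}
        return
    state = dict(graph_input)
    yield {"draft": state}                            # 2. normal node output
    _PAUSED[thread_id] = state                        # checkpoint before pausing
    yield {"__interrupt__": ({"value": "approve?", "id": "int-0"},)}  # 3./4. pause

config = {"configurable": {"thread_id": "t1"}}        # 1. thread_id in the config
first_run = list(stream({"doc": "v1"}, config))       # runs until the interrupt
resumed = list(stream(Command(resume="approved"), config))
```

The key point the simulation preserves is that both calls share the same thread_id, which is how the second call finds the paused state.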

Theoretical Basis

Stream execution with interrupts implements a cooperative scheduling model within the Pregel framework:

  • Interrupt event detection: The execution loop acts as a scheduler that checks interrupt conditions at well-defined points (before/after each super-step). This ensures that interrupts occur at consistent, predictable boundaries.
  • GraphInterrupt exception as signal: The GraphInterrupt exception serves as a structured signal from a node to the execution loop. It is a subclass of GraphBubbleUp, which is caught and handled by the Pregel loop rather than propagating to the user. The docstring explicitly states: "Never raised directly, or surfaced to the user."
  • Checkpoint-as-savepoint: Each interrupt creates a checkpoint that serves as a savepoint. The graph can be resumed from this savepoint, rewound to a prior savepoint, or branched from it. This is analogous to database savepoints in transaction management.
  • Stream as communication channel: The __interrupt__ stream event is the primary communication mechanism between the graph execution and the calling client. It carries the interrupt payload out of the execution context to where a human can interact with it.
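
The savepoint analogy can be made concrete with a toy checkpoint store; the keying scheme and function names here are illustrative, not LangGraph's checkpointer interface:

```python
# Toy savepoint store illustrating resume, rewind, and branch over checkpoints.

import copy

checkpoints = {}  # (thread_id, step) -> deep-copied state snapshot

def save(thread_id, step, state):
    checkpoints[(thread_id, step)] = copy.deepcopy(state)

def load(thread_id, step):
    return copy.deepcopy(checkpoints[(thread_id, step)])

save("t1", 1, {"messages": ["hi"]})
save("t1", 2, {"messages": ["hi", "draft"]})

resumed = load("t1", 2)              # resume from the most recent savepoint
rewound = load("t1", 1)             # rewind to an earlier savepoint
branch = load("t1", 1)
branch["messages"].append("alt")     # branch: diverge without touching the snapshot
```

Deep-copying on save and load is what makes rewinding and branching safe: later mutations cannot corrupt an earlier savepoint.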
