Implementation: Microsoft AutoGen GraphFlow Init
| Knowledge Sources | |
|---|---|
| Domains | Multi-Agent Systems, Workflow Configuration, Team Orchestration, Directed Graph Execution |
| Last Updated | 2026-02-11 00:00 GMT |
Overview
Concrete tool, provided by Microsoft AutoGen, for configuring an agent team whose execution order, branching, and parallelism are controlled by a directed graph.
Description
GraphFlow is a team class that extends BaseGroupChat with graph-directed speaker selection. It accepts a list of ChatAgent participants and a validated DiGraph that defines how agents execute. The graph determines agent execution order with support for:
- Sequential execution: A linear chain of agents (A -> B -> C).
- Parallel fan-out: One agent's completion triggers multiple downstream agents simultaneously (A -> B, A -> C).
- Fan-in synchronization: An agent waits for all (or any) upstream agents to complete before executing.
- Conditional branching: Edges are traversed based on message content, directing flow to different agents.
- Cyclic loops: Agents can loop back to earlier nodes, with mandatory exit conditions.
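The conditional-branching and loop-with-exit patterns above can be sketched in plain Python. This is an illustrative toy, not AutoGen internals: "agents" are plain functions, edges carry predicate conditions over the last message, and a turn cap provides the mandatory exit for the cycle.

```python
# Illustrative sketch (not AutoGen internals): conditional edges direct flow
# between named agents, and a turn cap bounds the writer/reviewer loop.

def run_flow(agents, edges, start, task, max_turns=10):
    """Walk a graph of agents. `edges` maps a node to (target, condition)
    pairs; the first edge whose condition accepts the last message is taken."""
    message = task
    node = start
    transcript = []
    for _ in range(max_turns):           # mandatory exit for cyclic graphs
        message = agents[node](message)  # "agent" = plain function here
        transcript.append((node, message))
        next_node = None
        for target, condition in edges.get(node, []):
            if condition(message):
                next_node = target
                break
        if next_node is None:            # no outgoing edge fired: done
            break
        node = next_node
    return transcript

# Toy writer/reviewer loop: reviewer approves once the draft is long enough.
agents = {
    "writer": lambda msg: msg + " words",
    "reviewer": lambda msg: ("APPROVE: " if len(msg) > 30 else "REVISE: ") + msg,
}
edges = {
    "writer": [("reviewer", lambda m: True)],
    "reviewer": [("writer", lambda m: m.startswith("REVISE"))],
}
transcript = run_flow(agents, edges, "writer", "Draft about cats,")
print(transcript[-1])  # last turn is the reviewer approving
```

The same shape appears in the cyclic example further below, where the reviewer's APPROVE keyword selects between the publisher edge and the loop back to the writer.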
Internally, GraphFlow creates a GraphFlowManager that maintains execution state: a ready queue of nodes, remaining incoming edge counts, and activation group tracking. The manager validates the graph on construction and enforces that cyclic graphs have either a termination condition or a maximum turn limit.
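The ready-queue and remaining-incoming-edge bookkeeping described above can be illustrated with a Kahn-style DAG walk. This is a conceptual sketch, not the actual GraphFlowManager: a node becomes ready only when all of its upstream edges are satisfied, which gives both fan-out and all-mode fan-in.

```python
# Illustrative sketch (not the actual GraphFlowManager): a ready queue plus
# remaining-incoming-edge counts runs fan-out branches and makes fan-in
# nodes wait for every upstream node to complete.
from collections import deque

def execute_dag(nodes, edges):
    """Run nodes in dependency order. `edges` is a list of (src, dst) pairs."""
    indegree = {n: 0 for n in nodes}
    successors = {n: [] for n in nodes}
    for src, dst in edges:
        successors[src].append(dst)
        indegree[dst] += 1
    ready = deque(n for n in nodes if indegree[n] == 0)  # source nodes
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)               # "execute" the agent
        for nxt in successors[node]:
            indegree[nxt] -= 1           # one upstream dependency done
            if indegree[nxt] == 0:       # fan-in: all upstream complete
                ready.append(nxt)
    return order

# A fans out to B and C; D fans in, waiting for both.
order = execute_dag(["A", "B", "C", "D"],
                    [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")])
print(order)  # A first, D last, after both B and C
```

The "any" activation mode mentioned above would instead mark a fan-in node ready as soon as its first upstream edge fires; the sketch shows only the all-mode case.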
GraphFlow is a Component and supports serialization via dump_component() and deserialization via load_component().
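The round-trip that dump_component() and load_component() provide can be pictured with a toy class. The ToyTeam below and its JSON payload are made up for illustration; AutoGen's real ComponentModel schema is different, but the dump-then-rebuild pattern is the same.

```python
# Illustrative sketch of component serialization: dump to a JSON-able
# config, then rebuild an equivalent instance from it. ToyTeam and its
# payload shape are hypothetical, not AutoGen's ComponentModel.
import json

class ToyTeam:
    def __init__(self, participants, max_turns=None):
        self.participants = participants
        self.max_turns = max_turns

    def dump_component(self):
        # Serialize configuration (not live runtime state) to JSON.
        return json.dumps({"participants": self.participants,
                           "max_turns": self.max_turns})

    @classmethod
    def load_component(cls, payload):
        cfg = json.loads(payload)
        return cls(cfg["participants"], cfg["max_turns"])

team = ToyTeam(["writer", "reviewer"], max_turns=20)
restored = ToyTeam.load_component(team.dump_component())
print(restored.participants, restored.max_turns)
```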
Usage
Use GraphFlow when you need to run a team of agents orchestrated by a directed graph. It is the primary entry point for graph-based multi-agent execution in AutoGen.
Code Reference
Source Location
- Repository: Microsoft AutoGen
- File: python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_graph/_digraph_group_chat.py (Lines 783-815)
Signature
class GraphFlow(BaseGroupChat, Component[GraphFlowConfig]):
    def __init__(
        self,
        participants: List[ChatAgent],
        graph: DiGraph,
        *,
        name: str | None = None,
        description: str | None = None,
        termination_condition: TerminationCondition | None = None,
        max_turns: int | None = None,
        runtime: AgentRuntime | None = None,
        custom_message_types: List[type[BaseAgentEvent | BaseChatMessage]] | None = None,
    ) -> None:
Import
from autogen_agentchat.teams import GraphFlow
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| participants | List[ChatAgent] | Yes | The agent instances that will execute. Each participant's name must correspond to a node in the graph. Names must be unique. |
| graph | DiGraph | Yes | The validated directed graph defining execution topology, edge conditions, and activation modes. |
| name | str or None | No | Name for the team. Defaults to "GraphFlow". |
| description | str or None | No | Description of the team. Defaults to "A team of agents". |
| termination_condition | TerminationCondition or None | No | Condition to stop execution (e.g., MaxMessageTermination, TextMentionTermination). Required for cyclic graphs if max_turns is not set. |
| max_turns | int or None | No | Maximum number of agent turns before forcing termination. Required for cyclic graphs if termination_condition is not set. |
| runtime | AgentRuntime or None | No | The agent runtime to use. If None, an embedded SingleThreadedAgentRuntime is created automatically. |
| custom_message_types | List[type] or None | No | Additional message types to register with the message factory beyond the defaults. |
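The rule that a cyclic graph must carry a termination_condition or max_turns can be sketched with a simple validator. Cycle detection here is Kahn-style peeling; it illustrates the constraint, not AutoGen's actual graph validator.

```python
# Illustrative sketch of the validation rule for cyclic graphs: without a
# termination condition or turn cap, construction should be rejected.
# has_cycle uses Kahn-style peeling, not AutoGen's own validator.

def has_cycle(nodes, edges):
    indegree = {n: 0 for n in nodes}
    successors = {n: [] for n in nodes}
    for src, dst in edges:
        successors[src].append(dst)
        indegree[dst] += 1
    ready = [n for n in nodes if indegree[n] == 0]
    seen = 0
    while ready:
        node = ready.pop()
        seen += 1
        for nxt in successors[node]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return seen < len(nodes)  # unpeeled nodes imply a cycle

def validate(nodes, edges, termination_condition=None, max_turns=None):
    if has_cycle(nodes, edges) and termination_condition is None and max_turns is None:
        raise ValueError("cyclic graph requires termination_condition or max_turns")

loop_nodes = ["writer", "reviewer"]
loop_edges = [("writer", "reviewer"), ("reviewer", "writer")]
validate(loop_nodes, loop_edges, max_turns=20)  # accepted: exit is bounded
try:
    validate(loop_nodes, loop_edges)            # no exit: rejected
except ValueError as e:
    print(e)
```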
Outputs
| Name | Type | Description |
|---|---|---|
| GraphFlow | GraphFlow | A configured team instance ready for execution via run() or run_stream(). Implements the Team protocol with run, run_stream, reset, save_state, and load_state methods. |
Usage Examples
Basic Example: Sequential Flow
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_ext.models.openai import OpenAIChatCompletionClient

async def main():
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")

    agent_a = AssistantAgent("A", model_client=model_client,
                             system_message="You are a helpful assistant.")
    agent_b = AssistantAgent("B", model_client=model_client,
                             system_message="Translate input to Chinese.")
    agent_c = AssistantAgent("C", model_client=model_client,
                             system_message="Translate input to English.")

    # Build directed graph: A -> B -> C
    builder = DiGraphBuilder()
    builder.add_node(agent_a).add_node(agent_b).add_node(agent_c)
    builder.add_edge(agent_a, agent_b).add_edge(agent_b, agent_c)
    graph = builder.build()

    # Configure the GraphFlow team
    team = GraphFlow(
        participants=builder.get_participants(),
        graph=graph,
        termination_condition=MaxMessageTermination(5),
    )

    result = await team.run(task="Write a short story about a cat.")
    print(result)

asyncio.run(main())
Cyclic Loop with Termination
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_ext.models.openai import OpenAIChatCompletionClient

async def main():
    model_client = OpenAIChatCompletionClient(model="gpt-4.1")

    writer = AssistantAgent("writer", model_client=model_client,
                            system_message="Write content based on feedback.")
    reviewer = AssistantAgent("reviewer", model_client=model_client,
                              system_message="Review. Say APPROVE if good, else give feedback.")
    publisher = AssistantAgent("publisher", model_client=model_client,
                               system_message="Publish the approved content.")

    builder = DiGraphBuilder()
    builder.add_node(writer).add_node(reviewer).add_node(publisher)
    builder.add_edge(writer, reviewer)
    builder.add_edge(reviewer, publisher,
                     condition=lambda msg: "APPROVE" in msg.to_model_text())
    builder.add_edge(reviewer, writer,
                     condition=lambda msg: "APPROVE" not in msg.to_model_text())
    builder.set_entry_point(writer)
    graph = builder.build()

    # Cyclic graph requires termination_condition or max_turns
    team = GraphFlow(
        participants=builder.get_participants(),
        graph=graph,
        termination_condition=MaxMessageTermination(20),
    )

    result = await team.run(task="Write a poem about AI agents.")
    print(result.stop_reason)

asyncio.run(main())