Implementation: Microsoft AutoGen RoundRobinGroupChat Init
| Knowledge Sources | |
|---|---|
| Domains | Multi-Agent Systems, Orchestration, Turn-Taking, AI Agents |
| Last Updated | 2026-02-11 00:00 GMT |
Overview
A concrete class provided by Microsoft AutoGen for assembling agents into a deterministic, sequential turn-taking team.
Description
RoundRobinGroupChat is the simplest team orchestration class in AutoGen. It takes a list of participants (agents or nested teams) and cycles through them in order, giving each one a turn to process the conversation and produce a response. The conversation continues until a termination condition is met or the maximum number of turns is reached.
Internally, RoundRobinGroupChat creates a RoundRobinGroupChatManager that handles the turn-taking logic. The manager uses a simple index-based rotation to select the next speaker. The team is built on AutoGen's runtime system with topic-based pub/sub messaging: each participant subscribes to a group topic, and the manager publishes messages to coordinate the conversation.
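The index-based rotation described above can be illustrated with a minimal standalone sketch. This is not the AutoGen source; it only shows the selection logic a round-robin manager performs, with RoundRobinSelector as a hypothetical name:

```python
# Illustrative sketch (not AutoGen source): index-based round-robin
# speaker selection, wrapping from the last participant back to the first.
class RoundRobinSelector:
    def __init__(self, participants):
        self._participants = list(participants)
        self._next_index = 0

    def select_next(self):
        # Pick the current participant, then advance the index modulo
        # the number of participants.
        speaker = self._participants[self._next_index]
        self._next_index = (self._next_index + 1) % len(self._participants)
        return speaker

selector = RoundRobinSelector(["writer", "reviewer"])
turns = [selector.select_next() for _ in range(5)]
print(turns)  # ['writer', 'reviewer', 'writer', 'reviewer', 'writer']
```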
Key characteristics:
- Participants can be ChatAgent instances or nested Team instances, enabling hierarchical multi-agent workflows.
- Termination is controlled by an optional TerminationCondition (composable with | and &).
- max_turns provides a hard upper limit independent of the termination condition.
- emit_team_events controls whether internal team orchestration events are surfaced during streaming.
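The | and & composition of termination conditions can be sketched with plain operator overloading. This is an illustrative stand-in, not AutoGen's TerminationCondition implementation; Condition and is_met are hypothetical names:

```python
# Illustrative sketch (not AutoGen source): composing stop conditions
# with | (either one stops the run) and & (both must be satisfied).
class Condition:
    def __init__(self, check):
        self._check = check  # check(messages) -> bool

    def is_met(self, messages):
        return self._check(messages)

    def __or__(self, other):
        return Condition(lambda m: self.is_met(m) or other.is_met(m))

    def __and__(self, other):
        return Condition(lambda m: self.is_met(m) and other.is_met(m))

# Analogues of MaxMessageTermination(6) and TextMentionTermination("APPROVE").
max_messages = Condition(lambda m: len(m) >= 6)
mentions_approve = Condition(lambda m: any("APPROVE" in t for t in m))

combined = max_messages | mentions_approve
print(combined.is_met(["looks good, APPROVE"]))  # True
print(combined.is_met(["draft 1"]))              # False
```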
Usage
Import RoundRobinGroupChat from autogen_agentchat.teams and instantiate it with a list of agent participants and an optional termination condition. Then await run() for a final result, or iterate run_stream() for streaming output.
Code Reference
Source Location
- Repository: Microsoft AutoGen
- File:
python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_round_robin_group_chat.py(lines 242-265)
Signature
class RoundRobinGroupChat:
def __init__(
self,
participants: List[ChatAgent | Team],
*,
name: str | None = None,
description: str | None = None,
termination_condition: TerminationCondition | None = None,
max_turns: int | None = None,
runtime: AgentRuntime | None = None,
custom_message_types: List[type[BaseAgentEvent | BaseChatMessage]] | None = None,
emit_team_events: bool = False,
) -> None:
Import
from autogen_agentchat.teams import RoundRobinGroupChat
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| participants | List[ChatAgent or Team] | Yes | Ordered list of agents or nested teams. Turn order follows list order, cycling from last back to first. |
| name | str or None | No | Name for this team instance. Defaults to "RoundRobinGroupChat". |
| description | str or None | No | Human-readable description of the team. Defaults to "A team of agents." |
| termination_condition | TerminationCondition or None | No | Composable condition that stops the conversation. If None, conversation runs until max_turns (if set) or indefinitely. |
| max_turns | int or None | No | Hard upper limit on the number of turns. Each agent response counts as one turn. If None, no turn limit is applied. |
| runtime | AgentRuntime or None | No | Custom runtime for agent registration and messaging. If None, a default SingleThreadedAgentRuntime is created. |
| custom_message_types | List[type] or None | No | Additional message types that participants may produce, beyond the standard set. |
| emit_team_events | bool | No | Whether to emit internal team orchestration events in the output stream. Defaults to False. |
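How max_turns and termination_condition jointly bound the conversation can be sketched as a plain loop. This is illustrative only, not the AutoGen implementation; run_round_robin and respond are hypothetical names, with respond standing in for one agent turn:

```python
# Illustrative sketch (not AutoGen source): max_turns is a hard cap,
# checked independently of the termination condition.
def run_round_robin(participants, respond, termination=None, max_turns=None):
    messages = []
    turn = 0
    while True:
        if max_turns is not None and turn >= max_turns:
            break  # hard upper limit reached
        speaker = participants[turn % len(participants)]
        messages.append(respond(speaker, messages))
        turn += 1
        if termination is not None and termination(messages):
            break  # condition met after this turn
    return messages

out = run_round_robin(
    ["writer", "reviewer"],
    respond=lambda name, msgs: f"{name}: turn {len(msgs)}",
    termination=lambda msgs: "APPROVE" in msgs[-1],
    max_turns=4,
)
print(len(out))  # 4: no response ever says APPROVE, so max_turns stops the loop
```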
Outputs
| Name | Type | Description |
|---|---|---|
| instance | RoundRobinGroupChat | A configured team instance. Call run() for a final TaskResult or run_stream() for an async stream of messages and the final TaskResult. |
Usage Examples
Basic Example
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient
async def main():
    model_client = OpenAIChatCompletionClient(model="gpt-4o")
    writer = AssistantAgent(
        name="writer",
        model_client=model_client,
        system_message="You are a creative writer. Write content based on the task.",
    )
    reviewer = AssistantAgent(
        name="reviewer",
        model_client=model_client,
        system_message="You are an editor. Review the writing and suggest improvements. Say APPROVE when satisfied.",
    )
    termination = MaxMessageTermination(max_messages=6)
    team = RoundRobinGroupChat(
        participants=[writer, reviewer],
        termination_condition=termination,
    )
    result = await team.run(task="Write a haiku about autumn.")
    print(result)

asyncio.run(main())
With Nested Teams
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination, TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient
async def main():
    model_client = OpenAIChatCompletionClient(model="gpt-4o")

    # Inner team: research sub-team
    researcher = AssistantAgent(name="researcher", model_client=model_client)
    fact_checker = AssistantAgent(name="fact_checker", model_client=model_client)
    research_team = RoundRobinGroupChat(
        participants=[researcher, fact_checker],
        termination_condition=MaxMessageTermination(4),
    )

    # Outer team: the research team feeds into a writer
    writer = AssistantAgent(name="writer", model_client=model_client)
    outer_team = RoundRobinGroupChat(
        participants=[research_team, writer],
        termination_condition=TextMentionTermination("TERMINATE"),
        max_turns=6,
    )

    result = await outer_team.run(task="Write a brief report on renewable energy.")
    print(result)

asyncio.run(main())