
Principle:CrewAIInc CrewAI Crew Integration In Flow

From Leeroopedia

Overview

Crew Integration in Flow is a composition pattern for embedding crew executions within flow methods: flow state is read to build crew inputs, and crew results are written back into state, bridging the imperative crew execution model with the event-driven flow execution model.

Description

CrewAI provides two execution models that serve different purposes:

  • Crew Execution -- A pipeline model where agents collaborate on tasks in sequential or hierarchical order, producing a CrewOutput containing raw text, structured Pydantic objects, or JSON.
  • Flow Execution -- An event-driven model where decorated methods form a directed graph, with state shared across methods and execution propagated through listeners and routers.

Crew Integration in Flow bridges these two models. Within a flow method (decorated with @start, @listen, or @router), developers can:

  1. Read data from self.state to construct crew inputs
  2. Instantiate a Crew with agents and tasks
  3. Call crew.kickoff(inputs={...}) to execute the crew pipeline
  4. Extract results from CrewOutput (via .raw, .pydantic, or .json_dict)
  5. Store results back into self.state for downstream listeners

This pattern allows a single flow to orchestrate multiple crews, each handling a specialized sub-task, with custom Python logic between crew executions to transform data, make routing decisions, or aggregate results.
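The five steps above can be sketched in isolation. This is an illustrative sketch, not CrewAI itself: the real `Flow`, `Crew`, and `CrewOutput` classes (and the `@start`/`@listen` decorators) are replaced by minimal stand-ins so the read-state, kickoff, store-result control flow is visible on its own.

```python
# Stand-in sketch of the crew-in-flow pattern; class names mirror CrewAI's
# but the bodies are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class CrewOutput:          # stand-in for crewai's CrewOutput
    raw: str

class Crew:                # stand-in: a real Crew would hold agents and tasks
    def kickoff(self, inputs: dict) -> CrewOutput:
        # A real crew would run its agent/task pipeline here.
        return CrewOutput(raw=f"report on {inputs['topic']}")

@dataclass
class ResearchState:       # plays the role of the flow's state model
    topic: str = "CrewAI flows"
    report: str = ""

class ResearchFlow:        # stand-in for a Flow[ResearchState] subclass
    def __init__(self):
        self.state = ResearchState()

    def run_research(self) -> str:     # would carry @start() in a real flow
        inputs = {"topic": self.state.topic}   # 1. read state into inputs
        result = Crew().kickoff(inputs=inputs) # 2-3. build crew and execute
        self.state.report = result.raw         # 4-5. extract and store back
        return self.state.report

flow = ResearchFlow()
print(flow.run_research())  # report on CrewAI flows
```

In the real library, `run_research` would be decorated and downstream methods would observe `self.state.report` via `@listen`; here the method is called directly to keep the sketch self-contained.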

Theoretical Basis

Crew Integration in Flow implements the Bridge Pattern from software design, connecting two distinct execution models:

Aspect           | Crew Model                                    | Flow Model
Execution style  | Imperative pipeline (sequential/hierarchical) | Event-driven DAG traversal
Data passing     | inputs dict to kickoff(); CrewOutput returned | Shared self.state object across methods
Concurrency      | Single crew runs tasks in order               | Multiple listeners can execute in parallel
Composition unit | Agent + Task                                  | Decorated method
Output type      | CrewOutput (raw, pydantic, json_dict)         | Method return value (any Python object)

The Bridge Pattern decouples the two execution models so they can evolve independently. A crew knows nothing about the flow that invoked it; a flow method simply treats crew execution as a function call.

This composition also enables the Saga Pattern from distributed systems: each crew execution is a compensatable step in a larger workflow, and the flow can implement compensation logic (e.g., retry, rollback) based on crew results stored in state.
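One way the saga idea can look in practice is a retry loop around a crew step. This is a hypothetical sketch: the validation rule ("ERROR" in the output), the `max_retries` parameter, and the `failures` bookkeeping key are all invented for illustration, and the crew is reduced to a plain callable.

```python
# Saga-style compensation around one crew step: if the crew's output fails
# a (assumed) validation check, record the failure in state and retry.
def run_with_compensation(kickoff, state, max_retries=2):
    """kickoff: callable(inputs) -> raw string; state: dict acting as flow state."""
    for attempt in range(max_retries + 1):
        raw = kickoff({"attempt": attempt})
        state["draft"] = raw                 # store crew result in flow state
        if "ERROR" not in raw:               # assumed quality check
            return state["draft"]
        state.setdefault("failures", []).append(attempt)  # compensation record
    raise RuntimeError("crew step failed after retries")

def flaky_crew(inputs):
    # Toy crew that fails on the first attempt, then succeeds.
    return "ERROR" if inputs["attempt"] == 0 else "clean draft"

state = {}
print(run_with_compensation(flaky_crew, state))  # clean draft
```

Because each attempt's result lands in state before validation, downstream logic (or a human reviewer) can inspect exactly what each compensated step produced.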

Usage

When to Use Crew Integration

  • When a flow step requires LLM-powered agent collaboration (research, writing, analysis)
  • When different steps need different agent configurations or tools
  • When crew results need post-processing before feeding into the next step
  • When multiple independent crews should run in parallel within a flow

Integration Pattern

  1. Prepare inputs from self.state as a dict
  2. Create the Crew with appropriate agents and tasks (inline or via YAML config)
  3. Call crew.kickoff(inputs=...) to execute
  4. Extract results from CrewOutput.raw (string), .pydantic (typed object), or .json_dict (dict)
  5. Store results back to self.state for downstream methods
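Step 4 offers three extraction paths, and which ones are populated depends on how the crew's final task was configured (typed output versus plain text). The sketch below uses a stand-in dataclass that mirrors the three accessors; the preference order shown (typed object, then dict, then raw text) is one reasonable convention, not a library rule.

```python
# Stand-in mirroring CrewOutput's three accessors; only `raw` is assumed
# to always be present.
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class CrewOutput:
    raw: str
    pydantic: Optional[Any] = None
    json_dict: Optional[dict] = None

def extract(out: CrewOutput):
    if out.pydantic is not None:     # prefer the typed object when present
        return out.pydantic
    if out.json_dict is not None:    # fall back to the structured dict
        return out.json_dict
    return out.raw                   # raw text is the last resort

out = CrewOutput(raw='{"score": 7}', json_dict={"score": 7})
print(extract(out))  # {'score': 7}
```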

Constraints

  • Crew execution is synchronous within a flow method (the method blocks until the crew completes)
  • The flow's state proxy provides thread safety, but the crew itself does not access self.state directly; data must be passed explicitly via the inputs dict
  • Crew errors propagate as exceptions; the flow method should handle or let them propagate to the flow's error handling
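The last constraint means a flow method can treat a failed crew like any failed function call: catch the exception, record a fallback in state, and let the flow continue. The sketch below is illustrative; the `summary`/`error` state keys and the `RuntimeError` are assumptions, not CrewAI specifics.

```python
# A flow method guarding one crew step: on failure, store an empty result
# plus the error message in state instead of crashing the whole flow.
def guarded_step(kickoff, state):
    try:
        state["summary"] = kickoff({"text": state["text"]})
    except RuntimeError as exc:      # a failed crew raises like any call
        state["summary"] = ""
        state["error"] = str(exc)    # downstream methods can route on this
    return state

def broken_crew(inputs):
    raise RuntimeError("LLM backend unavailable")

state = guarded_step(broken_crew, {"text": "hello"})
print(state["error"])  # LLM backend unavailable
```

Letting the exception propagate instead is equally valid when the whole flow should abort on a failed step; the choice belongs to the flow method, not the crew.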
