Principle:CrewAIInc_CrewAI_Built_In_Tool_Selection
Overview
A curated library of pre-built tools organized by domain (search, scraping, file I/O, databases, AI services) that can be directly assigned to agents without custom implementation.
Description
Built-In Tool Selection provides 60+ ready-to-use tools covering common agent needs across multiple domains. Rather than implementing common capabilities from scratch, developers can import and configure pre-built tools that are maintained as part of the CrewAI ecosystem.
The library is organized into the following categories:
- Web Search: Tools that query search engines and return results. Examples include SerperDevTool, TavilySearchTool, BraveSearchTool, and EXASearchTool. Each wraps a different search provider's API.
- Web Scraping: Tools that extract content from web pages. Options range from simple HTTP-based scraping (ScrapeWebsiteTool) to browser-automation-based approaches (SeleniumScrapingTool, StagehandTool) and managed scraping services (FirecrawlScrapeWebsiteTool).
- File Operations: Tools for reading, writing, and searching files. Includes FileReadTool, FileWriterTool, DirectoryReadTool, PDFSearchTool, and DOCXSearchTool.
- Databases: Tools that query relational and analytical databases. Covers MySQLSearchTool, PGSearchTool, and SnowflakeSearchTool.
- AI Services: Tools that invoke external AI models for generation tasks. Includes DallETool for image generation and VisionTool for image analysis.
- RAG (Retrieval-Augmented Generation): Tools that combine vector search with document content. Includes CSVSearchTool, JSONSearchTool, MDXSearchTool, and others that enable semantic search over structured and unstructured data.
- Code Execution: Tools like CodeInterpreterTool that allow agents to write and execute code.
Each tool extends BaseTool and requires tool-specific configuration, typically in the form of API keys (set via environment variables) or connection strings (passed as constructor arguments).
Key Considerations
- API key management: Most search and scraping tools require API keys. These should be configured via environment variables (e.g., SERPER_API_KEY, OPENAI_API_KEY) rather than hardcoded.
- Cost awareness: Many built-in tools invoke paid APIs. Consider setting max_usage_count on tools to limit invocation frequency and control costs.
- Tool selection for agents: Assign only the tools an agent needs. Providing too many tools increases the LLM's decision space and can degrade tool-selection accuracy.
- Fallback strategies: For critical capabilities like web search, consider configuring multiple tools from different providers so the agent can fall back if one service is unavailable.
- Installation: Built-in tools are distributed in the crewai-tools package, which is separate from the core crewai package. Install via pip install crewai-tools or pip install crewai[tools].
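Putting these considerations together, a typical wiring sketch follows. This is a configuration fragment under stated assumptions: crewai and crewai-tools are installed, SERPER_API_KEY is exported in the environment, and the role, goal, and backstory strings are illustrative placeholders.

```python
from crewai import Agent
from crewai_tools import ScrapeWebsiteTool, SerperDevTool

# SerperDevTool reads SERPER_API_KEY from the environment; cap its
# invocations with max_usage_count to keep paid-API costs bounded.
search = SerperDevTool(max_usage_count=10)
scrape = ScrapeWebsiteTool()  # simple HTTP-based scraping, no key needed

# Assign only the tools this agent actually needs.
researcher = Agent(
    role="Web Researcher",
    goal="Find and summarize recent articles on a topic",
    backstory="An analyst who researches topics on the open web.",
    tools=[search, scrape],
)
```

Keeping the tool list short, as here, preserves the LLM's tool-selection accuracy; a second search tool from another provider could be appended to give the agent a fallback when one service is unavailable.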
Theoretical Basis
This principle follows the Standard Library pattern where commonly needed capabilities are pre-built and distributed with the framework. Just as programming languages ship standard libraries for file I/O, networking, and data structures, CrewAI ships a tools library for common agent capabilities. This eliminates duplicated effort, ensures consistent quality, and reduces the barrier to building capable agents.
Relationship to Implementation
Implementation:CrewAIInc_CrewAI_CrewAI_Tools_Library
The crewai-tools package provides the concrete implementations of all built-in tools, organized as importable Python classes.
See Also
- Principle:CrewAIInc_CrewAI_Tool_Design -- The interface specification all built-in tools conform to
- Principle:CrewAIInc_CrewAI_Tool_Implementation -- The patterns used to implement built-in tools
- Principle:CrewAIInc_CrewAI_Tool_Assignment -- How built-in tools are assigned to agents and tasks