Principle: Anthropic Python SDK Tool Definition
| Knowledge Sources | |
|---|---|
| Domains | Tool_Use, LLM, Function_Calling |
| Last Updated | 2026-02-15 00:00 GMT |
Overview
Tool Definition is the foundational step in the Tool Use Integration workflow. It describes how Python functions are transformed into structured tool schemas that Claude can understand and invoke. The Anthropic Python SDK provides a decorator-based pattern that converts ordinary Python functions into tool objects with automatically inferred JSON Schema definitions, enabling a declarative approach to tool registration without manual schema authoring.
Core Concept: Function-to-Tool Schema Conversion
At the heart of tool use is the translation between two worlds: the Python function signature (with type annotations, default values, and docstrings) and the JSON Schema that the Claude API requires to understand what tools are available.
A tool definition sent to the API has three essential components:
- name -- A unique identifier the model uses to reference the tool
- description -- Natural language explaining what the tool does and when to use it
- input_schema -- A JSON Schema object describing the parameters the tool accepts
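To make the three components concrete, here is a hand-authored tool definition of the kind the SDK generates automatically; the field values are illustrative:

```python
# A hand-authored tool definition showing the three required components.
# The decorator-based flow described below produces an equivalent dict
# automatically from a function signature.
weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "The city name"},
        },
        "required": ["city"],
    },
}
```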
The SDK automates all three:
| Python Source | JSON Schema Field | Inference Rule |
|---|---|---|
| `func.__name__` | `name` | Direct mapping (snake_case) |
| Docstring short + long description | `description` | Extracted via `docstring_parser` |
| Type annotations on parameters | `input_schema.properties` | Converted via Pydantic `TypeAdapter` |
| Parameters without defaults | `input_schema.required` | Parameters lacking default values become required |
| Docstring `Args:` section | `properties.*.description` | Matched by parameter name |
Automatic JSON Schema Inference
The SDK uses Pydantic v2's TypeAdapter to convert the function's call signature into a JSON Schema. This leverages Pydantic's mature type-to-schema system, supporting:
- Primitive types: `str`, `int`, `float`, `bool`
- Optional types: `Optional[str]` becomes a nullable field
- Enum types: `Literal["celsius", "fahrenheit"]` becomes an enum constraint
- Complex types: `list[str]`, `dict[str, int]`, nested Pydantic models
- Default values: Parameters with defaults are excluded from the `required` array
The inference pipeline is:

1. The function is wrapped with `pydantic.validate_call`
2. A `TypeAdapter` is constructed from the validated wrapper
3. `TypeAdapter.json_schema()` generates the JSON Schema
4. A custom `GenerateJsonSchema` subclass enriches property descriptions from docstring parameters
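The core of this pipeline can be approximated with standard-library introspection alone. The sketch below is a simplified reimplementation for illustration, not the SDK's actual code (which delegates type conversion to Pydantic); it shows only how a signature yields `properties` and `required`:

```python
import inspect
from typing import get_type_hints

# Minimal mapping from Python primitives to JSON Schema type names.
# (Pydantic's TypeAdapter handles far more, e.g. Optional, Literal, models.)
_TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def infer_input_schema(func):
    """Sketch of signature-to-schema inference (stdlib only)."""
    hints = get_type_hints(func)
    properties, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        prop = {"type": _TYPE_MAP.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)   # no default value => required parameter
        else:
            prop["default"] = param.default
        properties[name] = prop
    return {"type": "object", "properties": properties, "required": required}

def get_weather(city: str, unit: str = "celsius") -> str:
    ...

schema = infer_input_schema(get_weather)
```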
Decorator Pattern for Declarative Registration
The SDK provides the `@beta_tool` decorator (and `@beta_async_tool` for async functions), which follows the standard Python decorator pattern with optional arguments:
```python
# Bare decorator -- all metadata inferred
@beta_tool
def get_weather(city: str, unit: str = "celsius") -> str:
    """Get the current weather for a city.

    Args:
        city: The city name
        unit: Temperature unit (celsius or fahrenheit)
    """
    return f"The weather in {city} is 22 degrees"


# Decorator with overrides
@beta_tool(name="weather_lookup", description="Fetch current weather data")
def get_weather(city: str) -> str:
    ...
```
This works through Python's overload pattern: when `beta_tool` receives a callable as its first argument, it wraps it immediately; when called with keyword arguments only, it returns a decorator function that performs the wrapping.
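The overload dispatch can be sketched as follows; this is an illustrative reimplementation of the pattern (attribute names like `tool_name` are invented here), not the SDK's source:

```python
# A decorator usable both bare (@tool) and with keyword arguments
# (@tool(name=...)), mirroring the dispatch @beta_tool performs.
def tool(func=None, *, name=None, description=None):
    def wrap(f):
        # Hypothetical attributes for illustration only.
        f.tool_name = name or f.__name__
        f.tool_description = description or (f.__doc__ or "").strip()
        return f
    if callable(func):   # used as @tool: wrap the function immediately
        return wrap(func)
    return wrap          # used as @tool(...): return the real decorator

@tool
def ping() -> str:
    """Check liveness."""
    return "pong"

@tool(name="echo_text")
def echo(text: str) -> str:
    return text
```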
The ToolParam Output Format
The `.to_dict()` method on the resulting `BetaFunctionTool` object produces a `BetaToolParam` (a TypedDict) with the following structure:
```json
{
  "name": "get_weather",
  "description": "Get the current weather for a city.",
  "input_schema": {
    "type": "object",
    "properties": {
      "city": {"type": "string", "description": "The city name"},
      "unit": {"type": "string", "default": "celsius", "description": "Temperature unit (celsius or fahrenheit)"}
    },
    "required": ["city"]
  }
}
```
This dict can be passed directly to `client.messages.create(tools=[...])`.
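As a rough sketch of that final step, the request payload can be assembled by hand; `tool_param` stands in for the output of `get_weather.to_dict()`, and the model id and prompt below are placeholders:

```python
# Sketch: assembling a messages.create request payload by hand.
tool_param = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

request = {
    "model": "claude-example-model",  # placeholder model id
    "max_tokens": 1024,
    "tools": [tool_param],
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
}
# At runtime this would be sent as client.messages.create(**request).
```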
Design Rationale
The decorator pattern was chosen because:
- Minimal boilerplate: Developers annotate functions they already have, rather than writing parallel schema definitions
- Single source of truth: The function signature is the schema; no drift between code and API contract
- Composability: `BetaFunctionTool` objects can be collected into lists, passed to runners, or serialized independently
- Type safety: Pydantic validates inputs at runtime, catching schema violations before tool execution