Overview
The Anthropic Messages Helper provides bidirectional conversion between Anthropic SDK message formats and Phoenix prompt version data structures.
Description
This module implements the conversion layer between Anthropic's Messages API parameters (as defined by the anthropic Python SDK) and the Phoenix prompt version format (v1.PromptVersionData). It exposes two primary public functions. create_prompt_version_from_anthropic() converts an Anthropic MessageCreateParamsBase object into a Phoenix PromptVersionData structure; to_chat_messages_and_kwargs() performs the reverse transformation, converting a Phoenix prompt version back into a list of Anthropic MessageParam objects plus an AnthropicMessageModelKwargs TypedDict suitable for passing directly to Anthropic().messages.create().
The module handles conversion of messages (system, user, assistant, and tool roles), content blocks (text, tool use, tool result), invocation parameters (max_tokens, temperature, top_p, stop_sequences, thinking configuration), and tool definitions and tool choice settings. It also supports template variable substitution via the TemplateFormatter system, as well as cross-provider parameter mapping, so Phoenix prompt versions originally created with OpenAI, Google, Azure, or AWS parameters can be converted to Anthropic-compatible invocation parameters.
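The module's actual cross-provider mapping logic is internal; as an illustrative sketch only (the function name and the set of parameters handled here are assumptions, not the module's real implementation), remapping a few common OpenAI-style invocation parameters to Anthropic names might look like this:

```python
# Illustrative sketch: NOT the module's real mapping code. Source names
# ("max_completion_tokens", "stop", ...) follow OpenAI's Chat Completions
# API; target names follow Anthropic's Messages API.

def remap_openai_to_anthropic(params: dict) -> dict:
    """Map a few common OpenAI invocation parameters to Anthropic equivalents."""
    out: dict = {}
    if "max_completion_tokens" in params:   # current OpenAI token-cap name
        out["max_tokens"] = params["max_completion_tokens"]
    elif "max_tokens" in params:            # older OpenAI name, same meaning
        out["max_tokens"] = params["max_tokens"]
    if "temperature" in params:             # shared between both providers
        out["temperature"] = params["temperature"]
    if "top_p" in params:
        out["top_p"] = params["top_p"]
    if "stop" in params:                    # OpenAI "stop" -> Anthropic "stop_sequences"
        stop = params["stop"]
        out["stop_sequences"] = [stop] if isinstance(stop, str) else list(stop)
    return out

print(remap_openai_to_anthropic({"max_completion_tokens": 512, "stop": "END"}))
# -> {'max_tokens': 512, 'stop_sequences': ['END']}
```

The real module performs this kind of translation automatically when converting a prompt version created from another provider's parameters.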
Usage
Use create_prompt_version_from_anthropic() when you want to capture an existing Anthropic API call configuration as a reusable Phoenix prompt version. Use to_chat_messages_and_kwargs() when you want to render a Phoenix prompt version into parameters that can be passed directly to the Anthropic Messages API, optionally substituting template variables at call time.
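The substitution behavior implied by the two template formats can be sketched in plain Python (a rough, self-contained illustration of what a MUSTACHE-style vs. F_STRING-style formatter does, not the library's actual TemplateFormatter code):

```python
import re
from typing import Mapping

def mustache_format(template: str, variables: Mapping[str, str]) -> str:
    """Replace {{name}} placeholders, MUSTACHE-style; unknown names are left intact."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: variables.get(m.group(1), m.group(0)),
        template,
    )

def f_string_format(template: str, variables: Mapping[str, str]) -> str:
    """Replace {name} placeholders, F_STRING-style."""
    return template.format(**variables)

print(mustache_format("Explain {{topic}} in simple terms.", {"topic": "DNS"}))
# -> Explain DNS in simple terms.
```

With template_format set to "NONE", message content is passed through unchanged.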
Code Reference
Source Location
Signature
```python
def create_prompt_version_from_anthropic(
    obj: MessageCreateParamsBase,
    /,
    *,
    description: Optional[str] = None,
    template_format: Literal["F_STRING", "MUSTACHE", "NONE"] = "MUSTACHE",
    model_provider: Literal["ANTHROPIC"] = "ANTHROPIC",
) -> v1.PromptVersionData: ...

def to_chat_messages_and_kwargs(
    obj: v1.PromptVersionData,
    /,
    *,
    variables: Mapping[str, str] = MappingProxyType({}),
    formatter: Optional[TemplateFormatter] = None,
) -> tuple[list[MessageParam], AnthropicMessageModelKwargs]: ...

class AnthropicMessageModelKwargs(TypedDict, total=False):
    model: Required[ModelParam]
    system: Union[str, list[TextBlockParam]]
    max_tokens: Required[int]
    stop_sequences: list[str]
    temperature: float
    top_p: float
    thinking: Union[ThinkingConfigEnabledParam, ThinkingConfigDisabledParam]
    tools: list[ToolParam]
    tool_choice: ToolChoiceParam
```
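Because the TypedDict is declared with total=False, only the fields marked Required (model and max_tokens) must be present; all other keys may be omitted and the dict still unpacks cleanly into Anthropic().messages.create(). A minimal self-contained sketch of that shape (a local stand-in class, no Phoenix or Anthropic imports):

```python
# typing.Required is available from Python 3.11; older versions need typing_extensions.
try:
    from typing import Required, TypedDict
except ImportError:
    from typing_extensions import Required, TypedDict

class MessageModelKwargs(TypedDict, total=False):
    """Local stand-in mirroring AnthropicMessageModelKwargs' required fields."""
    model: Required[str]
    max_tokens: Required[int]
    temperature: float  # optional thanks to total=False

# At runtime a TypedDict is a plain dict, so optional keys can simply be absent.
kwargs: MessageModelKwargs = {"model": "claude-sonnet-4-20250514", "max_tokens": 1024}
print(sorted(kwargs))
# -> ['max_tokens', 'model']
```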
Import
```python
from phoenix.client.helpers.sdk.anthropic.messages import (
    create_prompt_version_from_anthropic,
    to_chat_messages_and_kwargs,
)
```
I/O Contract
create_prompt_version_from_anthropic()
Inputs
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `obj` | `MessageCreateParamsBase` | Yes | Anthropic message creation parameters containing `model`, `messages`, `max_tokens`, and optional `tools`/`tool_choice` |
| `description` | `Optional[str]` | No | Description to attach to the prompt version |
| `template_format` | `Literal["F_STRING", "MUSTACHE", "NONE"]` | No | Template format for variable substitution (default: `"MUSTACHE"`) |
| `model_provider` | `Literal["ANTHROPIC"]` | No | Model provider identifier (default: `"ANTHROPIC"`) |
Outputs
| Name | Type | Description |
|------|------|-------------|
| return | `v1.PromptVersionData` | Phoenix prompt version data containing template, model configuration, invocation parameters, and optional tool definitions |
to_chat_messages_and_kwargs()
Inputs
| Name | Type | Required | Description |
|------|------|----------|-------------|
| `obj` | `v1.PromptVersionData` | Yes | Phoenix prompt version data to convert |
| `variables` | `Mapping[str, str]` | No | Template variables to substitute into message content (default: empty) |
| `formatter` | `Optional[TemplateFormatter]` | No | Custom template formatter; auto-detected from the prompt version if not provided |
Outputs
| Name | Type | Description |
|------|------|-------------|
| messages | `list[MessageParam]` | List of Anthropic message parameters ready for the Messages API |
| kwargs | `AnthropicMessageModelKwargs` | Keyword arguments including model, system prompt, invocation parameters, and tool configuration |
Usage Examples
```python
from anthropic.types.message_create_params import MessageCreateParamsBase

from phoenix.client.helpers.sdk.anthropic.messages import (
    create_prompt_version_from_anthropic,
    to_chat_messages_and_kwargs,
)

# Convert Anthropic params to a Phoenix prompt version
params: MessageCreateParamsBase = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "temperature": 0.7,
    "messages": [
        {"role": "user", "content": "Explain {{topic}} in simple terms."}
    ],
}
prompt_version = create_prompt_version_from_anthropic(params)

# Convert back to Anthropic format with variable substitution
messages, kwargs = to_chat_messages_and_kwargs(
    prompt_version,
    variables={"topic": "quantum computing"},
)

# Use directly with the Anthropic SDK
from anthropic import Anthropic

client = Anthropic()
response = client.messages.create(messages=messages, **kwargs)
```
Related Pages