Implementation: Arize AI Phoenix OpenAI Chat Helper
| Knowledge Sources | |
|---|---|
| Domains | AI_Observability, Client_SDK |
| Last Updated | 2026-02-14 05:30 GMT |
Overview
The OpenAI Chat Helper provides bidirectional conversion between OpenAI Chat Completions SDK formats and Phoenix prompt version data structures, with multi-provider support for OpenAI, Azure OpenAI, DeepSeek, xAI, and Ollama.
Description
This module implements the conversion layer between OpenAI's Chat Completions API parameters (as defined by the openai Python SDK's CompletionCreateParamsBase) and the Phoenix prompt version format (v1.PromptVersionData). It is the most comprehensive of the SDK helper modules at 1173 lines, reflecting the breadth of the OpenAI-compatible API ecosystem.
The function create_prompt_version_from_openai() converts an OpenAI CompletionCreateParamsBase object into a Phoenix PromptVersionData structure. The model_provider parameter accepts "OPENAI", "AZURE_OPENAI", "DEEPSEEK", "XAI", or "OLLAMA", enabling a single conversion path for all OpenAI-compatible providers. The function to_chat_messages_and_kwargs() performs the reverse transformation, producing a list of ChatCompletionMessageParam objects and an OpenAIChatCompletionModelKwargs TypedDict suitable for passing to OpenAI().chat.completions.create().
The module handles all message roles (system, developer, user, assistant, tool), content parts (text, tool calls, tool results), invocation parameters (temperature, top_p, max_tokens, max_completion_tokens, frequency_penalty, presence_penalty, seed, reasoning_effort, stop sequences), tool definitions with strict mode support, tool choice (none, auto, required, specific function), parallel tool call settings, and response format (JSON Schema). It supports cross-provider parameter mapping from Anthropic, Google, and AWS invocation parameter formats into OpenAI-compatible parameters.
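The cross-provider mapping mentioned above can be illustrated with a small sketch. This is not Phoenix's actual implementation; the function name and mapping table below are hypothetical, and only show the general idea of renaming provider-specific invocation parameters (here, Anthropic-style names) into OpenAI-compatible keyword arguments while dropping parameters with no OpenAI equivalent.

```python
# Illustrative sketch only -- not Phoenix's actual mapping code.
def map_anthropic_to_openai(params: dict) -> dict:
    """Normalize Anthropic-style invocation parameters into
    OpenAI-compatible keyword arguments."""
    # Direct renames where both providers expose the same concept.
    renames = {
        "max_tokens": "max_tokens",
        "temperature": "temperature",
        "top_p": "top_p",
        "stop_sequences": "stop",
    }
    out = {}
    for src, dst in renames.items():
        if src in params:
            out[dst] = params[src]
    # Parameters with no OpenAI equivalent (e.g. top_k) are dropped.
    return out


mapped = map_anthropic_to_openai(
    {"max_tokens": 1024, "temperature": 0.5, "stop_sequences": ["END"], "top_k": 40}
)
# mapped == {"max_tokens": 1024, "temperature": 0.5, "stop": ["END"]}
```

The real module performs analogous normalization for Google and AWS parameter formats as well.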
Usage
Use create_prompt_version_from_openai() when capturing an existing OpenAI (or compatible provider) API call configuration as a reusable Phoenix prompt version. Use to_chat_messages_and_kwargs() when rendering a Phoenix prompt version into parameters for the OpenAI Chat Completions API, with optional template variable substitution.
Code Reference
Source Location
- Repository: Arize_ai_Phoenix
- File: packages/phoenix-client/src/phoenix/client/helpers/sdk/openai/chat.py
- Lines: 1173
Signature
def create_prompt_version_from_openai(
    obj: CompletionCreateParamsBase,
    /,
    *,
    description: Optional[str] = None,
    template_format: Literal["F_STRING", "MUSTACHE", "NONE"] = "MUSTACHE",
    model_provider: Literal["OPENAI", "AZURE_OPENAI", "DEEPSEEK", "XAI", "OLLAMA"] = "OPENAI",
) -> v1.PromptVersionData: ...

def to_chat_messages_and_kwargs(
    obj: v1.PromptVersionData,
    /,
    *,
    variables: Mapping[str, str] = MappingProxyType({}),
    formatter: Optional[TemplateFormatter] = None,
) -> tuple[list[ChatCompletionMessageParam], OpenAIChatCompletionModelKwargs]: ...

class OpenAIChatCompletionModelKwargs(TypedDict, total=False):
    model: Required[str]
    response_format: ResponseFormat
    frequency_penalty: float
    max_completion_tokens: int
    max_tokens: int
    presence_penalty: float
    reasoning_effort: ChatCompletionReasoningEffort
    seed: int
    stop: list[str]
    temperature: float
    top_logprobs: int
    top_p: float
    parallel_tool_calls: bool
    tool_choice: ChatCompletionToolChoiceOptionParam
    tools: Sequence[ChatCompletionToolUnionParam]
Import
from phoenix.client.helpers.sdk.openai.chat import (
    create_prompt_version_from_openai,
    to_chat_messages_and_kwargs,
)
I/O Contract
create_prompt_version_from_openai()
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| obj | CompletionCreateParamsBase | Yes | OpenAI chat completion parameters containing model, messages, and optional settings |
| description | Optional[str] | No | Optional description to attach to the prompt version |
| template_format | Literal["F_STRING", "MUSTACHE", "NONE"] | No | Template format for variable substitution (default: "MUSTACHE") |
| model_provider | Literal["OPENAI", "AZURE_OPENAI", "DEEPSEEK", "XAI", "OLLAMA"] | No | Provider identifier determining parameter storage format (default: "OPENAI") |
Outputs
| Name | Type | Description |
|---|---|---|
| return | v1.PromptVersionData | Phoenix prompt version data with template, model config, invocation parameters, tools, and response format |
to_chat_messages_and_kwargs()
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| obj | v1.PromptVersionData | Yes | Phoenix prompt version data to convert |
| variables | Mapping[str, str] | No | Template variables to substitute into message content (default: empty) |
| formatter | Optional[TemplateFormatter] | No | Custom template formatter; auto-detected from the prompt version's template format if not provided |
Outputs
| Name | Type | Description |
|---|---|---|
| messages | list[ChatCompletionMessageParam] | List of OpenAI chat completion message parameters |
| kwargs | OpenAIChatCompletionModelKwargs | Keyword arguments including model, invocation parameters, tools, tool_choice, and response_format |
Usage Examples
from openai.types.chat.completion_create_params import CompletionCreateParamsBase
from phoenix.client.helpers.sdk.openai.chat import (
    create_prompt_version_from_openai,
    to_chat_messages_and_kwargs,
)

# Convert OpenAI params to a Phoenix prompt version
params: CompletionCreateParamsBase = {
    "model": "gpt-4o",
    "temperature": 0.7,
    "max_completion_tokens": 1024,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain {{topic}} in simple terms."},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current weather",
                "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
            },
        }
    ],
}
prompt_version = create_prompt_version_from_openai(params)

# Convert back with variable substitution
messages, kwargs = to_chat_messages_and_kwargs(
    prompt_version,
    variables={"topic": "neural networks"},
)

# Use directly with the OpenAI SDK
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(messages=messages, **kwargs)

# Also works with Azure OpenAI
prompt_version_azure = create_prompt_version_from_openai(
    params, model_provider="AZURE_OPENAI"
)
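The MUSTACHE variable substitution shown above can be approximated in plain Python. This is an illustrative sketch only; Phoenix's actual TemplateFormatter may handle escaping, sections, and edge cases differently.

```python
import re


def mustache_format(template: str, variables: dict[str, str]) -> str:
    """Replace {{name}} placeholders with values from `variables`,
    leaving unknown placeholders untouched."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: variables.get(m.group(1), m.group(0)),
        template,
    )


rendered = mustache_format(
    "Explain {{topic}} in simple terms.", {"topic": "neural networks"}
)
# rendered == "Explain neural networks in simple terms."
```

With `template_format="NONE"`, no substitution is performed and message content passes through verbatim.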