Overview
The Google GenAI Helper provides conversion between Google Generative AI SDK formats and Phoenix prompt version data structures.
Description
This module implements the conversion layer between Google's Generative AI SDK types (the protobuf-based protos.Content, GenerationConfig, and ToolConfig) and the Phoenix prompt version format (v1.PromptVersionData). It exposes two primary public functions: create_prompt_version_from_google(), currently a placeholder that raises NotImplementedError, is intended to convert Google GenAI parameters into a Phoenix prompt version; to_chat_messages_and_kwargs() converts a Phoenix prompt version into a list of Google protos.Content messages and a GoogleModelKwargs TypedDict suitable for constructing a GenerativeModel.
The module handles:

- message conversion with role mapping (user, assistant/model, system) and text content parts via protobuf Part objects;
- generation configuration (temperature, max_output_tokens, stop_sequences, presence_penalty, frequency_penalty, top_p, top_k);
- tool definitions via FunctionDeclaration conversion;
- tool configuration with function calling mode mapping (NONE, AUTO, ANY, specific function);
- JSON schema conversion between the Phoenix dictionary format and Google's protobuf Schema type.

System messages are extracted and passed via the system_instruction keyword argument rather than as content messages.
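The role mapping and system-message extraction described above can be sketched as follows. This is a simplified illustration under assumptions, not the module's actual implementation: the `Message` class and `split_messages` helper are hypothetical, and plain dicts stand in for protos.Content.

```python
from dataclasses import dataclass

@dataclass
class Message:
    role: str   # Phoenix role: "user", "assistant", or "system" (assumed names)
    content: str

# Google's SDK uses only "user" and "model" roles in Content objects;
# the Phoenix "assistant" role maps to "model".
_ROLE_MAP = {"user": "user", "assistant": "model", "model": "model"}

def split_messages(messages):
    """Map roles to Google's convention and pull system messages aside."""
    contents, system_parts = [], []
    for m in messages:
        if m.role == "system":
            # System messages become system_instruction, not Content entries.
            system_parts.append(m.content)
        else:
            contents.append({"role": _ROLE_MAP[m.role], "parts": [m.content]})
    return contents, system_parts

msgs = [
    Message("system", "You are terse."),
    Message("user", "Hi"),
    Message("assistant", "Hello"),
]
contents, system = split_messages(msgs)
# contents holds "user"/"model" entries; system holds ["You are terse."]
```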
Usage
Use to_chat_messages_and_kwargs() when you want to render a Phoenix prompt version into parameters compatible with the Google Generative AI SDK, optionally substituting template variables. The returned GoogleModelKwargs can be passed directly to the GenerativeModel() constructor, and the messages list can be used with the model's chat interface.
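Variable substitution with the default MUSTACHE template format replaces `{{name}}` placeholders with values from the `variables` mapping. A minimal sketch of that behavior (not the actual TemplateFormatter implementation, which also handles the F_STRING and NONE formats):

```python
import re

def render_mustache(template: str, variables: dict) -> str:
    # Replace each {{name}} placeholder with its value; leave unknown
    # placeholders untouched rather than raising.
    def sub(match):
        key = match.group(1).strip()
        return variables.get(key, match.group(0))
    return re.sub(r"\{\{\s*([^{}]+?)\s*\}\}", sub, template)

print(render_mustache("Tell me about {{topic}}.", {"topic": "machine learning"}))
# -> Tell me about machine learning.
```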
Code Reference
Source Location
Signature
def create_prompt_version_from_google(
obj: Any,
/,
*,
description: Optional[str] = None,
template_format: Literal["F_STRING", "MUSTACHE", "NONE"] = "MUSTACHE",
model_provider: Literal["GOOGLE"] = "GOOGLE",
) -> v1.PromptVersionData: ... # raises NotImplementedError
def to_chat_messages_and_kwargs(
obj: v1.PromptVersionData,
/,
*,
variables: Mapping[str, str] = MappingProxyType({}),
formatter: Optional[TemplateFormatter] = None,
) -> tuple[list[protos.Content], GoogleModelKwargs]: ...
class GoogleModelKwargs(TypedDict, total=False):
model_name: Required[str]
generation_config: GenerationConfig
system_instruction: str | list[str]
tool_config: protos.ToolConfig
tools: list[content_types.Tool]
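Because GoogleModelKwargs is a TypedDict with total=False and only model_name marked Required, it can be built up incrementally and unpacked into the model constructor. The sketch below mirrors that shape with stand-in types (the real TypedDict uses Google SDK types for generation_config, tool_config, and tools; the helper function and model name are illustrative):

```python
from typing import Optional, TypedDict

class _RequiredKwargs(TypedDict):
    model_name: str  # the only required key

class ModelKwargsSketch(_RequiredKwargs, total=False):
    system_instruction: str
    generation_config: dict  # placeholder for google.generativeai.GenerationConfig

def build_kwargs(model_name: str, system: Optional[str] = None) -> ModelKwargsSketch:
    """Only include optional keys that are actually set."""
    kwargs: ModelKwargsSketch = {"model_name": model_name}
    if system is not None:
        kwargs["system_instruction"] = system
    return kwargs

kw = build_kwargs("gemini-1.5-flash", system="Answer briefly.")
# kw can then be unpacked, e.g. GenerativeModel(**kw)
```

Omitting unset keys matters here: passing explicit None values for optional keys would not match the TypedDict's total=False semantics.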
Import
from phoenix.client.helpers.sdk.google_generativeai.generate_content import (
create_prompt_version_from_google,
to_chat_messages_and_kwargs,
)
I/O Contract
to_chat_messages_and_kwargs()
Inputs
| Name | Type | Required | Description |
| --- | --- | --- | --- |
| obj | v1.PromptVersionData | Yes | Phoenix prompt version data to convert |
| variables | Mapping[str, str] | No | Template variables to substitute into message content (default: empty) |
| formatter | Optional[TemplateFormatter] | No | Custom template formatter; auto-detected from the prompt version if not provided |
Outputs
| Name | Type | Description |
| --- | --- | --- |
| messages | list[protos.Content] | List of Google protobuf Content objects representing the conversation |
| kwargs | GoogleModelKwargs | Keyword arguments including model_name, generation_config, optional system_instruction, and tool configuration |
create_prompt_version_from_google()
Inputs
| Name | Type | Required | Description |
| --- | --- | --- | --- |
| obj | Any | Yes | Google GenAI parameters (not yet implemented) |
| description | Optional[str] | No | Optional description for the prompt version |
| template_format | Literal["F_STRING", "MUSTACHE", "NONE"] | No | Template format (default: "MUSTACHE") |
| model_provider | Literal["GOOGLE"] | No | Model provider identifier (default: "GOOGLE") |
Outputs
| Name | Type | Description |
| --- | --- | --- |
| return | v1.PromptVersionData | Phoenix prompt version data (currently raises NotImplementedError) |
Usage Examples
from phoenix.client.helpers.sdk.google_generativeai.generate_content import (
to_chat_messages_and_kwargs,
)
# Given a Phoenix prompt version fetched from the server
# (client is an instance of the Phoenix client)
prompt_version = client.prompts.get(name="my-prompt")
# Convert to Google GenAI format with variable substitution
messages, kwargs = to_chat_messages_and_kwargs(
prompt_version,
variables={"topic": "machine learning"},
)
# Use with Google Generative AI SDK
import google.generativeai as genai
model = genai.GenerativeModel(**kwargs)
chat = model.start_chat(history=messages[:-1])
response = chat.send_message(messages[-1].parts)
Related Pages