Implementation: googleapis/python-genai GenerateContentConfig Setup
| Knowledge Sources | |
|---|---|
| Domains | NLP, Decoding_Strategies |
| Last Updated | 2026-02-15 00:00 GMT |
Overview
Concrete tool for specifying generation parameters and sampling strategies provided by the google-genai types module.
Description
GenerateContentConfig is a Pydantic model that holds all optional configuration for content generation requests. It controls sampling parameters (temperature, top_p, top_k), output format (response_mime_type, response_schema), safety settings, system instructions, tool configurations, and caching references. It is passed as the config parameter to models.generate_content and models.generate_content_stream.
Usage
Create a GenerateContentConfig when you need to customize generation behavior. Pass it as a dict or Pydantic model to the config parameter. For default behavior, omit the config entirely. Common use cases include setting a system instruction, adjusting temperature, or enabling JSON output mode.
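As a minimal sketch of the two equivalent forms mentioned above (dict or Pydantic model; values here are illustrative):

```python
# The same configuration expressed as a plain dict; the config parameter
# also accepts this directly, per the usage note above.
config_as_dict = {
    "temperature": 0.2,
    "max_output_tokens": 256,
    "system_instruction": "You are a concise assistant.",
}

# Pydantic-model equivalent (requires google-genai to be installed):
# from google.genai import types
# config_as_model = types.GenerateContentConfig(**config_as_dict)
```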
Code Reference
Source Location
- Repository: googleapis/python-genai
- File: google/genai/types.py
- Lines: L5077
Signature
class GenerateContentConfig(_common.BaseModel):
    """Optional model configuration parameters."""

    http_options: Optional[HttpOptions] = None
    system_instruction: Optional[ContentUnion] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    top_k: Optional[float] = None
    candidate_count: Optional[int] = None
    max_output_tokens: Optional[int] = None
    stop_sequences: Optional[list[str]] = None
    response_mime_type: Optional[str] = None
    response_schema: Optional[SchemaUnion] = None
    response_modalities: Optional[list[str]] = None
    safety_settings: Optional[list[SafetySetting]] = None
    tools: Optional[list[ToolUnion]] = None
    tool_config: Optional[ToolConfig] = None
    cached_content: Optional[str] = None
    response_logprobs: Optional[bool] = None
    logprobs: Optional[int] = None
    presence_penalty: Optional[float] = None
    frequency_penalty: Optional[float] = None
    seed: Optional[int] = None
    automatic_function_calling: Optional[AutomaticFunctionCallingConfig] = None
Import
from google.genai import types
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| temperature | Optional[float] | No | Sampling temperature (0.0 = deterministic, higher = more random) |
| top_p | Optional[float] | No | Nucleus sampling threshold |
| top_k | Optional[float] | No | Number of highest-probability tokens considered at each sampling step |
| candidate_count | Optional[int] | No | Number of response candidates to generate |
| max_output_tokens | Optional[int] | No | Maximum tokens in the response |
| system_instruction | Optional[ContentUnion] | No | System-level behavioral instruction |
| response_mime_type | Optional[str] | No | Output MIME type (e.g., 'application/json') |
| response_schema | Optional[SchemaUnion] | No | JSON schema for structured output |
| tools | Optional[list[ToolUnion]] | No | Tool/function definitions |
| cached_content | Optional[str] | No | Cache resource name for context caching |
| safety_settings | Optional[list[SafetySetting]] | No | Safety filter configurations |
| seed | Optional[int] | No | Random seed for reproducibility |
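As a sketch of how the sampling fields above compose, here is a near-deterministic configuration written as a plain dict (values are illustrative, not prescribed by the source):

```python
# Near-deterministic sampling sketch: temperature 0.0 plus a fixed seed.
# Note: a seed improves reproducibility but is not an absolute guarantee
# across model versions.
deterministic_config = {
    "temperature": 0.0,
    "top_k": 1.0,                # consider only the single most likely token
    "seed": 42,
    "max_output_tokens": 64,
    "stop_sequences": ["\n\n"],  # stop generation at the first blank line
}
```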
Outputs
| Name | Type | Description |
|---|---|---|
| GenerateContentConfig | GenerateContentConfig | Configuration object to pass as config to generate_content |
Usage Examples
Basic Configuration
from google import genai
from google.genai import types
client = genai.Client(api_key="YOUR_API_KEY")
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Write a haiku about programming.",
    config=types.GenerateContentConfig(
        temperature=0.9,
        max_output_tokens=100,
        system_instruction="You are a creative poet.",
    ),
)
print(response.text)
JSON Output Mode
from google import genai
from google.genai import types
client = genai.Client(api_key="YOUR_API_KEY")
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="List 3 programming languages with their creators.",
    config=types.GenerateContentConfig(
        response_mime_type="application/json",
        temperature=0.0,
    ),
)
print(response.text) # Returns JSON string
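Structured output goes one step further than JSON mode: pairing response_mime_type with response_schema constrains the response to a specific shape. A minimal sketch of such a schema written as a plain dict (the field names name/creator are illustrative; the exact accepted schema forms depend on the SDK version):

```python
# Illustrative OpenAPI-style schema for "language + creator" records.
# The keys (type, properties, items, required) follow the general
# Schema shape accepted by response_schema.
language_schema = {
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "name": {"type": "string"},
            "creator": {"type": "string"},
        },
        "required": ["name", "creator"],
    },
}

structured_config = {
    "response_mime_type": "application/json",
    "response_schema": language_schema,
    "temperature": 0.0,
}
```

Passed as config=structured_config to generate_content, this constrains the model to emit a JSON array of objects with exactly those two string fields.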