Implementation: Ollama FromChatRequest
| Knowledge Sources | |
|---|---|
| Domains | API_Design, Data_Transformation |
| Last Updated | 2026-02-14 00:00 GMT |
Overview
Concrete tool, provided by the openai package, that translates OpenAI chat completion requests into Ollama's native request format.
Description
FromChatRequest converts an OpenAI ChatCompletionRequest into an Ollama api.ChatRequest. It handles message content type conversion (string and array-of-parts formats), image URL decoding (data URIs and HTTP fetches), tool definition mapping, and options translation.
ChatMiddleware is the Gin middleware that orchestrates the translation: it parses the OpenAI request, calls FromChatRequest, replaces the request body with the translated Ollama request, and installs a custom ChatWriter that translates responses back.
FromCompleteRequest handles the legacy /v1/completions endpoint, and CompletionsMiddleware wraps it.
Usage
Invoked automatically by the middleware when requests arrive at OpenAI-compatible endpoints.
Code Reference
Source Location
- Repository: ollama
- File: openai/openai.go (FromChatRequest, FromCompleteRequest), middleware/openai.go (ChatMiddleware, CompletionsMiddleware)
- Lines: openai.go:L448-616 (FromChatRequest), openai.go:L680-763 (FromCompleteRequest), openai.go:L396-447 (ChatMiddleware), openai.go:L310-344 (CompletionsMiddleware)
Signature
func FromChatRequest(r ChatCompletionRequest) (*api.ChatRequest, error)
func FromCompleteRequest(r CompletionRequest) (api.GenerateRequest, error)
func ChatMiddleware() gin.HandlerFunc
func CompletionsMiddleware() gin.HandlerFunc
Import
import "github.com/ollama/ollama/openai"
import "github.com/ollama/ollama/middleware"
I/O Contract
Inputs (FromChatRequest)
| Name | Type | Required | Description |
|---|---|---|---|
| r | ChatCompletionRequest | Yes | OpenAI format request with Model, Messages, Tools, Temperature, TopP, MaxTokens, Stream, etc. |
Outputs (FromChatRequest)
| Name | Type | Description |
|---|---|---|
| request | *api.ChatRequest | Translated Ollama request with Messages (including decoded images), Tools, Options |
| error | error | Non-nil if message content parsing or image decoding fails |
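The dual content shapes in the input (plain string vs. array of typed parts) drive the conversion logic. The sketch below shows how both shapes might be flattened; `flattenContent` is a hypothetical helper, not the package's actual function.

```go
package main

import "fmt"

// flattenContent handles both OpenAI content shapes: a plain string, or an
// array of typed parts ("text" and "image_url"). Hypothetical helper
// mirroring the conversion FromChatRequest is described as performing.
func flattenContent(content any) (text string, images []string, err error) {
	switch c := content.(type) {
	case string:
		return c, nil, nil
	case []any:
		for _, part := range c {
			p, ok := part.(map[string]any)
			if !ok {
				return "", nil, fmt.Errorf("invalid content part")
			}
			switch p["type"] {
			case "text":
				if s, ok := p["text"].(string); ok {
					text += s
				}
			case "image_url":
				if img, ok := p["image_url"].(map[string]any); ok {
					if u, ok := img["url"].(string); ok {
						images = append(images, u)
					}
				}
			}
		}
		return text, images, nil
	default:
		return "", nil, fmt.Errorf("unsupported content type %T", content)
	}
}

func main() {
	text, images, _ := flattenContent([]any{
		map[string]any{"type": "text", "text": "What is in this image?"},
		map[string]any{"type": "image_url", "image_url": map[string]any{"url": "data:image/png;base64,AAAA"}},
	})
	fmt.Println(text, len(images)) // prints "What is in this image? 1"
}
```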
Usage Examples
OpenAI Request with Images
curl http://localhost:11434/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "llava",
"messages": [{
"role": "user",
"content": [
{"type": "text", "text": "What is in this image?"},
{"type": "image_url", "image_url": {"url": "data:image/png;base64,..."}}
]
}]
}'