Implementation:Ollama GenerateRoutes OpenAI

From Leeroopedia
Knowledge Sources
Domains: API_Design, Networking
Last Updated: 2026-02-14 00:00 GMT

Overview

GenerateRoutes is the concrete entry point for registering OpenAI-compatible API routes, with translation middleware provided by the server package.

Description

The OpenAI route registration section of GenerateRoutes maps OpenAI endpoint paths to Ollama's native handlers wrapped in translation middleware:

  • /v1/chat/completions → ChatMiddleware + ChatHandler
  • /v1/completions → CompletionsMiddleware + GenerateHandler
  • /v1/embeddings → EmbeddingsMiddleware + EmbedHandler
  • /v1/models → ListMiddleware + ListHandler

Each middleware function translates the incoming request into Ollama's native format before invoking the handler, and translates the native response back into the OpenAI format afterward.
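To illustrate what the translation layer does, here is a minimal sketch of the chat-completions round trip. The real middleware is written in Go inside Ollama's openai package; the function names below are illustrative, and the field shapes are assumptions based on Ollama's public /api/chat REST API and the OpenAI chat-completions schema:

```python
# Sketch of the request/response translation performed by ChatMiddleware.
# Function names are hypothetical; field shapes follow Ollama's public
# /api/chat documentation and the OpenAI chat-completions schema.

def to_native_chat_request(openai_req: dict) -> dict:
    """Translate an OpenAI /v1/chat/completions body into Ollama's /api/chat body."""
    return {
        "model": openai_req["model"],
        "messages": openai_req["messages"],
        # OpenAI clients default to non-streaming; Ollama's native API
        # streams by default, so the flag must be set explicitly.
        "stream": openai_req.get("stream", False),
    }

def to_openai_chat_response(native_resp: dict, model: str) -> dict:
    """Wrap Ollama's native chat response in the OpenAI response envelope."""
    return {
        "object": "chat.completion",
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": native_resp["message"],
                "finish_reason": "stop" if native_resp.get("done") else None,
            }
        ],
    }

req = {"model": "llama3", "messages": [{"role": "user", "content": "Hello!"}]}
native = to_native_chat_request(req)
resp = to_openai_chat_response(
    {"message": {"role": "assistant", "content": "Hi!"}, "done": True}, req["model"]
)
print(native["stream"])                            # False
print(resp["choices"][0]["message"]["content"])    # Hi!
```

The handler itself never sees the OpenAI shape; it only receives and produces the native format, which is what keeps the OpenAI layer a thin, self-contained wrapper.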

Usage

These routes are registered automatically when the Ollama server starts; no additional configuration is required. Clients use the standard OpenAI base-URL pattern: http://localhost:11434/v1/

Code Reference

Source Location

  • Repository: ollama
  • File: server/routes.go
  • Lines: L1560-1660 (GenerateRoutes, OpenAI route section)

Signature

func (s *Server) GenerateRoutes(rc *ollama.Registry) (http.Handler, error)

Import

import "github.com/ollama/ollama/server"

I/O Contract

Inputs

Name Type Required Description
rc *ollama.Registry No Optional registry client for model operations

Outputs

Name Type Description
http.Handler http.Handler Configured HTTP handler with all routes including OpenAI-compatible endpoints
error error Non-nil if route setup fails

Usage Examples

Client Usage with OpenAI Python Library

from openai import OpenAI

# Point OpenAI client at Ollama
client = OpenAI(
    base_url="http://localhost:11434/v1/",
    api_key="ollama",  # any string works
)

response = client.chat.completions.create(
    model="llama3",
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)
print(response.choices[0].message.content)

curl Example

curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
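The /v1/models route follows the same pattern: ListMiddleware reshapes Ollama's native model listing into OpenAI's list envelope. A minimal sketch (the helper name is hypothetical; the native field names follow Ollama's public /api/tags documentation):

```python
def to_openai_model_list(native_tags: dict) -> dict:
    """Translate Ollama's /api/tags response into OpenAI's /v1/models list shape."""
    return {
        "object": "list",
        "data": [
            {"id": m["name"], "object": "model", "owned_by": "library"}
            for m in native_tags.get("models", [])
        ],
    }

tags = {"models": [{"name": "llama3:latest"}, {"name": "mistral:latest"}]}
print(to_openai_model_list(tags)["data"][0]["id"])  # llama3:latest
```

This is why `client.models.list()` in the OpenAI Python library works unchanged against an Ollama server.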

Related Pages

  • Implements Principle
  • Requires Environment
