Implementation:Ollama Ollama Serve

From Leeroopedia
Knowledge Sources
Domains Systems, Networking
Last Updated 2026-02-14 00:00 GMT

Overview

Concrete entry point for bootstrapping the Ollama HTTP inference server, provided by the server package.

Description

The Serve function is the top-level entry point for the Ollama server. It configures logging, performs blob storage housekeeping (fixing blob permissions, pruning orphaned layers and empty directories), initializes the model scheduler, registers all HTTP API routes via GenerateRoutes, and starts the HTTP server on the provided listener. It also sets up graceful shutdown handling and heartbeat monitoring.

The companion GenerateRoutes method on Server registers all API endpoints using the Gin HTTP framework, including native Ollama routes (/api/generate, /api/chat, /api/pull, /api/push, /api/create), OpenAI-compatible routes (/v1/chat/completions, /v1/completions, /v1/models, /v1/embeddings), and Anthropic-compatible routes.

Usage

This is the entry point called by the ollama serve CLI command. It should be called once at application startup with a bound network listener.

Code Reference

Source Location

  • Repository: ollama
  • File: server/routes.go
  • Lines: L1661-1856 (Serve), L1560-1660 (GenerateRoutes)

Signature

func Serve(ln net.Listener) error
func (s *Server) GenerateRoutes(rc *ollama.Registry) (http.Handler, error)

Import

import "github.com/ollama/ollama/server"

I/O Contract

Inputs

Name Type Required Description
ln net.Listener Yes TCP listener for the server to accept connections on

Outputs

Name Type Description
error error Non-nil if the server fails to start or a fatal initialization error occurs
Side effect Running HTTP server Serves all registered API routes until context cancellation

Usage Examples

Starting the Server

package main

import (
    "net"

    "github.com/ollama/ollama/server"
)

func main() {
    // Bind Ollama's default address and port; Serve takes ownership
    // of the listener.
    ln, err := net.Listen("tcp", "127.0.0.1:11434")
    if err != nil {
        panic(err)
    }
    // Serve blocks until the server shuts down or hits a fatal error.
    if err := server.Serve(ln); err != nil {
        panic(err)
    }
}

Related Pages

Implements Principle

Requires Environment
