Implementation: Ollama ConvertModel
| Knowledge Sources | Details |
|---|---|
| Domains | Model_Architecture, Format_Conversion |
| Last Updated | 2026-02-14 00:00 GMT |
Overview
Concrete tool, provided by the convert package, for converting HuggingFace models to GGUF format with automatic architecture detection.
Description
ConvertModel is the top-level conversion entry point. It reads config.json from the model directory, detects the architecture, dispatches to the matching architecture-specific converter, processes all tensors and metadata, and writes the complete GGUF file.
LoadModelMetadata performs the architecture detection step, parsing config.json and returning the architecture-specific converter along with the parsed tokenizer.
Usage
Called by CreateHandler when a FROM directive points to a SafeTensors model directory rather than an existing Ollama model.
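Concretely, this path is exercised when a Modelfile's FROM line names a local directory of SafeTensors weights rather than a model tag; the path below is a placeholder.

```
# Modelfile: FROM pointing at a local HuggingFace/SafeTensors directory
# triggers conversion to GGUF during `ollama create`
FROM /path/to/huggingface/model
```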
Code Reference
Source Location
- Repository: ollama
- File: convert/convert.go
- Lines: L367-388 (ConvertModel), L255-361 (LoadModelMetadata)
Signature
func ConvertModel(fsys fs.FS, f *os.File) error
func LoadModelMetadata(fsys fs.FS) (ModelKV, *Tokenizer, error)
Import
import "github.com/ollama/ollama/convert"
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| fsys | fs.FS | Yes | Filesystem rooted at model directory (config.json, safetensors, tokenizer files) |
| f | *os.File | Yes | Output file for GGUF binary |
Outputs
| Name | Type | Description |
|---|---|---|
| error | error | Non-nil if architecture unsupported, files missing, or conversion fails |
| Side effect | GGUF file | Complete GGUF binary written to output file |
Usage Examples
Converting a HuggingFace Model
```go
package main

import (
	"log"
	"os"

	"github.com/ollama/ollama/convert"
)

func main() {
	// Filesystem rooted at the HuggingFace model directory.
	modelDir := os.DirFS("/path/to/huggingface/model")

	outFile, err := os.Create("model.gguf")
	if err != nil {
		log.Fatal(err)
	}
	defer outFile.Close()

	if err := convert.ConvertModel(modelDir, outFile); err != nil {
		// Unsupported architecture, missing files, or conversion failure.
		log.Fatal(err)
	}
}
```