Principle: Ollama Local Manifest Management
| Knowledge Sources | |
|---|---|
| Domains | Storage, Model_Management |
| Last Updated | 2026-02-14 00:00 GMT |
Overview
A filesystem-based manifest management system that writes, reads, removes, and resolves model manifests and blob paths in a content-addressable storage layout.
Description
Local Manifest Management provides the CRUD operations for model manifests on the local filesystem. Manifests are stored in a hierarchical directory structure: `~/.ollama/models/manifests/{host}/{namespace}/{model}/{tag}`. Blobs are stored in a flat directory: `~/.ollama/models/blobs/sha256-{hex}`.
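The two path rules above can be sketched as small helpers. This is a minimal illustration, not the real implementation: the function names, the hardcoded home directory, and the digest format are assumptions.

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// modelsDir would normally be resolved from the user's home directory
// (~/.ollama/models); hardcoded here purely for illustration.
const modelsDir = "/home/user/.ollama/models"

// manifestPath maps model coordinates to the hierarchical manifest path:
// manifests/{host}/{namespace}/{model}/{tag}. Hypothetical helper name.
func manifestPath(host, namespace, model, tag string) string {
	return filepath.Join(modelsDir, "manifests", host, namespace, model, tag)
}

// blobPath maps a digest like "sha256:abc123" into the flat blob
// directory, replacing ":" with "-" so the digest is a valid filename.
func blobPath(digest string) string {
	return filepath.Join(modelsDir, "blobs", strings.Replace(digest, ":", "-", 1))
}

func main() {
	fmt.Println(manifestPath("registry.ollama.ai", "library", "llama3", "latest"))
	fmt.Println(blobPath("sha256:abc123"))
}
```

Keeping blobs in a single flat directory keyed by digest is what makes the store content-addressable: any number of manifests can reference the same blob file without duplication.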
The system supports writing new manifests (after pull or create), removing manifests (for delete operations), listing all manifests (for the model list API), and resolving filesystem paths from model names and blob digests.
Usage
Use this principle when implementing a content-addressable model store that needs filesystem-based persistence with hierarchical naming.
Theoretical Basis
The storage layout:
```
~/.ollama/models/
├── manifests/
│   └── registry.ollama.ai/
│       └── library/
│           └── llama3/
│               └── latest (JSON manifest)
└── blobs/
    ├── sha256-abc123... (model weights)
    ├── sha256-def456... (template)
    └── sha256-789ghi... (system prompt)
```
Operations:
- Write: Serialize manifest JSON and write to path derived from model name.
- Remove: Delete manifest file and optionally prune unreferenced blobs.
- Path Resolution: `model.Name` → filesystem path (manifests), or digest → blob path.