Implementation: Ollama DownloadBlob
| Knowledge Sources | |
|---|---|
| Domains | Networking, Storage |
| Last Updated | 2026-02-14 00:00 GMT |
Overview
Concrete functions in the server package for downloading and uploading model blobs with parallel chunked transfer.
Description
downloadBlob downloads a single blob by digest. It checks the local cache first, then downloads over HTTP with range-request support. It returns a boolean indicating whether the blob was already cached.
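The cache-check-then-range-download decision described above can be sketched as follows. This is a minimal illustration, not Ollama's actual code: the function names, the size comparison, and the resume logic are assumptions based on the documented behavior (cache first, then HTTP with range support).

```go
package main

import (
	"fmt"
	"os"
)

// cacheState reports whether a blob at path is fully cached (a cache
// hit) and, if not, the byte offset from which a range download should
// resume. Illustrative sketch only; not Ollama's implementation.
func cacheState(path string, totalSize int64) (cacheHit bool, resumeFrom int64) {
	info, err := os.Stat(path)
	if err != nil {
		return false, 0 // nothing on disk: start from byte 0
	}
	if info.Size() >= totalSize {
		return true, 0 // complete blob already cached
	}
	return false, info.Size() // partial file: resume mid-blob
}

// rangeHeader builds the HTTP Range header value used to resume a
// partial download from the given offset.
func rangeHeader(resumeFrom int64) string {
	return fmt.Sprintf("bytes=%d-", resumeFrom)
}

func main() {
	f, _ := os.CreateTemp("", "blob-*")
	defer os.Remove(f.Name())
	f.Write(make([]byte, 100)) // simulate an interrupted 100-byte partial
	f.Close()

	hit, off := cacheState(f.Name(), 250)
	fmt.Println(hit, off, rangeHeader(off)) // false 100 bytes=100-
}
```

A fresh path yields a full download from offset 0; a complete file yields a cache hit and no network traffic at all.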
uploadBlob uploads a blob to the registry. It first attempts a cross-repository mount so that blobs already present in another repository are deduplicated, then falls back to a chunked upload.
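When the mount attempt fails and the chunked-upload fallback runs, the blob is split into fixed-size ranges. The sketch below shows only that range arithmetic; the chunk struct, function name, and chunk size are illustrative assumptions, not Ollama's actual constants.

```go
package main

import "fmt"

// chunk is one contiguous byte range of a blob to upload.
type chunk struct {
	Offset, Length int64
}

// chunkRanges splits a blob of the given size into fixed-size upload
// chunks; the final chunk carries the remainder. Sketch only.
func chunkRanges(size, chunkSize int64) []chunk {
	var out []chunk
	for off := int64(0); off < size; off += chunkSize {
		n := chunkSize
		if size-off < n {
			n = size - off // short final chunk
		}
		out = append(out, chunk{Offset: off, Length: n})
	}
	return out
}

func main() {
	// A 250-byte blob with a 100-byte chunk size yields three ranges.
	for _, c := range chunkRanges(250, 100) {
		fmt.Printf("bytes %d-%d\n", c.Offset, c.Offset+c.Length-1)
	}
}
```

Each range would then be sent as one PATCH of a registry upload session, with progress reported through the fn callback after every chunk.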
PullModel orchestrates the complete pull flow using downloadBlob for each layer.
Usage
Called internally by PullModel and PushModel for individual blob transfers.
Code Reference
Source Location
- Repository: ollama
- File: server/download.go (downloadBlob), server/upload.go (uploadBlob)
- Lines: download.go:L468-509 (downloadBlob), upload.go:L369-405 (uploadBlob)
Signature
```go
func downloadBlob(ctx context.Context, opts downloadOpts) (cacheHit bool, _ error)
func uploadBlob(ctx context.Context, n model.Name, layer manifest.Layer, opts *registryOptions, fn func(api.ProgressResponse)) error
```
Import
```go
import "github.com/ollama/ollama/server"
```
I/O Contract
Inputs (downloadBlob)
| Name | Type | Required | Description |
|---|---|---|---|
| ctx | context.Context | Yes | Cancellation context |
| opts.digest | string | Yes | SHA-256 blob digest to download |
| opts.n | model.Name | Yes | Model name (for registry URL construction) |
| opts.regOpts | *registryOptions | Yes | Registry authentication options |
| opts.fn | func(api.ProgressResponse) | Yes | Progress callback |
Outputs (downloadBlob)
| Name | Type | Description |
|---|---|---|
| cacheHit | bool | True if blob was already cached locally |
| error | error | Non-nil if download fails |
| Side effect | Blob file | Downloaded to ~/.ollama/models/blobs/sha256-<hash> |
Usage Examples
Pull a Model (uses downloadBlob internally)
```shell
# Downloads each blob layer
ollama pull llama3:latest
```
Push a Model (uses uploadBlob internally)
```shell
ollama push my-namespace/my-model:latest
```