Implementation:Mlflow Mlflow Pyfunc Load Model

From Leeroopedia
Domains ML_Ops, Model_Management
Last Updated 2026-02-13 20:00 GMT

Overview

A concrete tool, provided by the MLflow library, for loading a persisted MLflow model into a framework-agnostic prediction object.

Description

mlflow.pyfunc.load_model downloads (if remote) or locates (if local) a model artifact identified by model_uri, deserialises the model implementation, validates Python version compatibility, checks pip dependency requirements, and returns a PyFuncModel instance with a unified .predict() method.

The function supports a wide range of URI schemes:

  • runs:/<run_id>/<artifact_name> -- load from a specific experiment run.
  • models:/<model_name>/<version> -- load a specific registered version.
  • models:/<model_name>@<alias> -- load via a registry alias such as "champion" or "production".
  • Local filesystem paths and remote artifact store URIs (S3, GCS, ADLS, etc.).

An optional model_config parameter allows runtime configuration to be injected at load time, making the model's behaviour tuneable without re-logging.

Usage

Use this function in any context where a previously logged or registered model must produce predictions: batch scoring scripts, REST serving endpoints, evaluation notebooks, or CI/CD validation pipelines. Prefer alias-based URIs in production to decouple deployment from version numbers.

Code Reference

Source Location

  • Repository: mlflow
  • File: mlflow/pyfunc/__init__.py
  • Lines: 1072-1208

Signature

def load_model(
    model_uri: str,
    suppress_warnings: bool = False,
    dst_path: str | None = None,
    model_config: str | Path | dict[str, Any] | None = None,
) -> PyFuncModel

Import

import mlflow.pyfunc

I/O Contract

Inputs

  • model_uri (str, required) -- The location of the MLflow model in URI format. Supported schemes include runs:/, models:/, s3://, mlflow-artifacts:/, and local filesystem paths.
  • suppress_warnings (bool, optional, default False) -- If True, suppresses non-fatal warnings about dependency mismatches during loading.
  • dst_path (str or None, optional) -- Local filesystem directory to which the model artifact is downloaded. Must already exist. If None, a temporary directory is created.
  • model_config (str, Path, or dict[str, Any], optional) -- Runtime configuration made available to the model's load_context and predict methods. Can be a file path or a dictionary.

Outputs

  • return value (PyFuncModel) -- A loaded model object exposing a .predict(data) method that accepts pandas DataFrames, dicts, lists, or numpy arrays and returns predictions. Also exposes .metadata and .model_meta for introspection.

Usage Examples

Basic Usage

import mlflow.pyfunc

# Load by run URI
model = mlflow.pyfunc.load_model("runs:/abc123def456/my-model")
predictions = model.predict({"feature_a": [1.0, 2.0], "feature_b": [3.0, 4.0]})

# Load by registered model name and version
model_v2 = mlflow.pyfunc.load_model("models:/MyModel/2")
predictions_v2 = model_v2.predict(test_dataframe)

# Load by registry alias (recommended for production)
champion = mlflow.pyfunc.load_model("models:/MyModel@champion")
champion_predictions = champion.predict(scoring_data)

# Load with runtime configuration
configured = mlflow.pyfunc.load_model(
    "models:/MyModel@champion",
    model_config={"temperature": 0.7, "max_tokens": 256},
)
configured_predictions = configured.predict(input_data)

Related Pages

Implements Principle

Requires Environment

Uses Heuristic
