Implementation: mlflow.pyfunc.log_model
| Knowledge Sources | |
|---|---|
| Domains | ML_Ops, Model_Management |
| Last Updated | 2026-02-13 20:00 GMT |
Overview
A concrete tool from the MLflow library for logging a trained model as a PyFunc artifact.
Description
mlflow.pyfunc.log_model serialises a trained model and records it as an artifact of the currently active MLflow run. The function supports three workflows: passing a PythonModel subclass instance, passing a callable (functional model), or referencing a Python source file that defines the model (model-from-code). In every case the result is a directory containing the serialised model, an MLmodel descriptor, a conda.yaml / requirements.txt for environment reproduction, and optionally an input example and model signature.
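The MLmodel descriptor mentioned above might look roughly like the following sketch. The exact fields vary by MLflow version and flavour; the values shown here (paths, Python version, placeholder UUID) are illustrative only:

```yaml
artifact_path: upper-model
flavors:
  python_function:
    env:
      conda: conda.yaml
      virtualenv: python_env.yaml
    loader_module: mlflow.pyfunc.model
    python_model: python_model.pkl
    python_version: 3.10.12
model_uuid: <generated-uuid>   # illustrative placeholder
signature:
  inputs: '[{"type": "string"}]'
  outputs: '[{"type": "string"}]'
```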
When the registered_model_name parameter is supplied, the function additionally creates or updates a version in the model registry, combining the logging and registration steps into a single call. The returned ModelInfo object contains the model_uri that downstream consumers use to load the model.
Usage
Use this function inside an active MLflow run after training a model. It is the primary entry point for persisting custom Python models that implement the PyFunc interface. For native framework models (scikit-learn, PyTorch, etc.), flavour-specific loggers (e.g., mlflow.sklearn.log_model) are also available; these additionally record framework-specific metadata while still producing a model loadable through the generic PyFunc interface.
Code Reference
Source Location
- Repository: mlflow
- File: mlflow/pyfunc/__init__.py
- Lines: 3391-3654
Signature
```python
def log_model(
    artifact_path=None,
    loader_module=None,
    data_path=None,
    code_paths=None,
    infer_code_paths=False,
    conda_env=None,
    python_model=None,
    artifacts=None,
    registered_model_name=None,
    signature: ModelSignature = None,
    input_example: ModelInputExample = None,
    await_registration_for=DEFAULT_AWAIT_MAX_SLEEP_SECONDS,
    pip_requirements=None,
    extra_pip_requirements=None,
    metadata=None,
    model_config=None,
    streamable=None,
    resources: str | list[Resource] | None = None,
    auth_policy: AuthPolicy | None = None,
    prompts: list[str | Prompt] | None = None,
    name=None,
    params: dict[str, Any] | None = None,
    tags: dict[str, Any] | None = None,
    model_type: str | None = None,
    step: int = 0,
    model_id: str | None = None,
) -> ModelInfo
```
Import
```python
import mlflow.pyfunc
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| python_model | PythonModel, callable, or str | Yes (unless the legacy loader_module workflow is used) | The model to log. A PythonModel subclass instance, a callable with a single argument, or a file path to a model-from-code script. |
| name | str | Recommended | The artifact name under which the model is stored within the run. |
| signature | ModelSignature | No | Describes the model's expected input and output schema. Auto-inferred from type annotations when available. |
| input_example | ModelInputExample | No | A representative input used for documentation and schema validation. |
| registered_model_name | str | No | If provided, the model is automatically registered under this name in the model registry. |
| artifact_path | str | No | Deprecated. Use name instead. |
| artifacts | dict[str, str] | No | Dictionary mapping artifact names to URIs; resolved to local paths at load time. |
| conda_env | str or dict | No | Conda environment specification for the model. |
| pip_requirements | str or list[str] | No | Pip requirements for the model environment. |
| extra_pip_requirements | str or list[str] | No | Additional pip requirements appended to the auto-detected set. |
| model_config | str, Path, or dict | No | Configuration dictionary or file path available at load and predict time. |
| tags | dict[str, Any] | No | Key-value tags to attach to the logged model. |
| model_id | str or None | No | Optional explicit model ID for the logged model. |
Outputs
| Name | Type | Description |
|---|---|---|
| return value | ModelInfo | Metadata object containing model_uri (e.g., runs:/<run_id>/model), model_id, flavors, signature, and other logged model details. |
Usage Examples
Basic Usage
```python
import mlflow
import mlflow.pyfunc
from mlflow.models import infer_signature


class UpperModel(mlflow.pyfunc.PythonModel):
    def predict(self, context, model_input, params=None):
        return [s.upper() for s in model_input]


with mlflow.start_run():
    signature = infer_signature(["hello"], ["HELLO"])
    model_info = mlflow.pyfunc.log_model(
        name="upper-model",
        python_model=UpperModel(),
        signature=signature,
        input_example=["hello"],
    )

print(model_info.model_uri)
# Output: runs:/<run_id>/upper-model

# Load and use the model
loaded = mlflow.pyfunc.load_model(model_info.model_uri)
print(loaded.predict(["world"]))
# Output: ["WORLD"]
```