Implementation:Bentoml BentoML Pytest Plugin

From Leeroopedia
Knowledge Sources
Domains Testing, Pytest
Last Updated 2026-02-13 15:00 GMT

Overview

A pytest plugin that provides BentoML-specific test fixtures, markers, session setup, and environment isolation for running BentoML integration and unit tests.

Description

The plugin.py module implements a comprehensive pytest plugin for the BentoML test suite. It provides:

Hooks and Markers:

  • pytest_report_header -- Adds the BentoML version to the pytest report header.
  • pytest_addoption -- Registers two CLI options: --run-gpu-tests and --run-grpc-tests for selectively enabling GPU and gRPC test suites.
  • pytest_configure -- Registers the custom markers requires_gpus and requires_grpc.
  • pytest_runtest_setup -- Skips tests marked with requires_gpus or requires_grpc when the corresponding CLI flags are not provided.
  • pytest_generate_tests -- Dynamically parametrizes tests that use the deployment_mode or model_store fixtures.
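The opt-in flag and marker machinery described above can be sketched as follows. This is a minimal illustration based on the hook descriptions on this page; the help strings and skip messages are assumptions, not the plugin's actual text:

```python
import pytest


def pytest_addoption(parser):
    # Register the two opt-in CLI flags named above.
    parser.addoption("--run-gpu-tests", action="store_true", default=False,
                     help="run tests marked with requires_gpus")
    parser.addoption("--run-grpc-tests", action="store_true", default=False,
                     help="run tests marked with requires_grpc")


def pytest_configure(config):
    # Declare the custom markers so pytest does not warn about them.
    config.addinivalue_line("markers", "requires_gpus: test needs --run-gpu-tests")
    config.addinivalue_line("markers", "requires_grpc: test needs --run-grpc-tests")


def pytest_runtest_setup(item):
    # Skip marked tests unless the matching flag was passed on the CLI.
    if "requires_gpus" in item.keywords and not item.config.getoption("--run-gpu-tests"):
        pytest.skip("needs --run-gpu-tests")
    if "requires_grpc" in item.keywords and not item.config.getoption("--run-grpc-tests"):
        pytest.skip("needs --run-grpc-tests")
```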

Session Lifecycle:

  • pytest_sessionstart -- Creates a temporary BentoML home directory, clears analytics caches, and monkey-patches environment variables (PROMETHEUS_MULTIPROC_DIR, BENTOML_BUNDLE_LOCAL_BUILD, BENTOML_DO_NOT_TRACK, BENTOML_HOME) to isolate the test environment.
  • pytest_sessionfinish -- Restores the original environment variables and resets the Prometheus multiproc directory.
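The sessionstart/sessionfinish pairing can be illustrated with a standalone context manager. The helper itself is hypothetical; the environment-variable names come from this page, while the values assigned to them are assumptions:

```python
import contextlib
import os
import tempfile


@contextlib.contextmanager
def isolated_bentoml_env():
    """Point BentoML at a throwaway home directory and restore the
    original environment afterwards, mirroring the session lifecycle
    described above."""
    keys = ("PROMETHEUS_MULTIPROC_DIR", "BENTOML_BUNDLE_LOCAL_BUILD",
            "BENTOML_DO_NOT_TRACK", "BENTOML_HOME")
    saved = {k: os.environ.get(k) for k in keys}
    with tempfile.TemporaryDirectory() as home:
        os.environ["BENTOML_HOME"] = home
        os.environ["BENTOML_DO_NOT_TRACK"] = "True"
        os.environ["BENTOML_BUNDLE_LOCAL_BUILD"] = "True"
        os.environ["PROMETHEUS_MULTIPROC_DIR"] = os.path.join(home, "prom_multiproc")
        os.makedirs(os.environ["PROMETHEUS_MULTIPROC_DIR"], exist_ok=True)
        try:
            yield home
        finally:
            # Restore the pre-session environment, deleting keys that
            # were not set before.
            for k, v in saved.items():
                if v is None:
                    os.environ.pop(k, None)
                else:
                    os.environ[k] = v
```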

Fixtures:

  • bentoml_home -- Returns the temporary BentoML home directory path (session scope).
  • clean_context -- Provides a session-scoped contextlib.ExitStack for cleanup of context managers.
  • img_file -- Creates a random BMP image file in a temporary directory.
  • bin_file -- Creates a random binary file in a temporary directory.
  • fixture_metrics_client -- Returns a PrometheusClient instance for testing metrics.
  • fixture_change_dir -- Changes the working directory to the test file's directory and restores it afterward.
  • fixture_reset_config -- Resets BentoMLContainer config to its pre-test state after each test function.

Deployment Mode Parametrization: The _setup_deployment_mode helper determines available deployment modes (standalone, distributed, container) based on the platform and CI environment (GitHub Actions, VSCode remote containers, GitHub Codespaces).
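The mode-selection logic might look like the following sketch; the specific environment-variable checks are assumptions based on the description above, not the plugin's actual conditions:

```python
import os
import sys


def available_deployment_modes() -> list[str]:
    """Start from all deployment modes and drop "container" in
    environments where nested Docker is typically unavailable."""
    modes = ["standalone", "distributed", "container"]
    in_github_actions = os.environ.get("GITHUB_ACTIONS") == "true"
    in_codespaces = os.environ.get("CODESPACES") == "true"
    in_remote_container = os.environ.get("REMOTE_CONTAINERS") == "true"
    if (sys.platform == "win32" and in_github_actions) \
            or in_codespaces or in_remote_container:
        modes.remove("container")
    return modes
```

pytest_generate_tests would then parametrize any test requesting the deployment_mode fixture over the returned list.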

Usage

This plugin is automatically loaded when running pytest in a BentoML project. Use the --run-gpu-tests and --run-grpc-tests flags to enable specific test categories. Tests can request the provided fixtures (e.g., bentoml_home, clean_context, deployment_mode) in their function signatures.

Code Reference

Source Location

Signature

# Hooks
def pytest_report_header(config: Config) -> list[str]: ...
def pytest_addoption(parser: Parser) -> None: ...
def pytest_configure(config: Config) -> None: ...
def pytest_runtest_setup(item: Item) -> None: ...
def pytest_generate_tests(metafunc: Metafunc): ...
def pytest_sessionstart(session: Session) -> None: ...
def pytest_sessionfinish(session: Session, exitstatus: int | ExitCode) -> None: ...

# Fixtures
@pytest.fixture(scope="session")
def bentoml_home() -> str: ...

@pytest.fixture(scope="session", autouse=True)
def clean_context() -> t.Generator[contextlib.ExitStack, None, None]: ...

@pytest.fixture()
def img_file(tmpdir: str) -> str: ...

@pytest.fixture()
def bin_file(tmpdir: str) -> str: ...

@pytest.fixture(scope="module", name="prom_client")
def fixture_metrics_client() -> PrometheusClient: ...

@pytest.fixture(scope="function", name="change_test_dir")
def fixture_change_dir(request: FixtureRequest) -> t.Generator[None, None, None]: ...

@pytest.fixture(scope="function", autouse=True)
def fixture_reset_config() -> t.Generator[None, None, None]: ...

Import

# Typically auto-discovered by pytest via entry points.
# Manual import:
from bentoml.testing.pytest.plugin import pytest_sessionstart, pytest_addoption

I/O Contract

Inputs

Name Type Required Description
--run-gpu-tests CLI flag No Enable tests marked with requires_gpus
--run-grpc-tests CLI flag No Enable tests marked with requires_grpc
deployment_mode fixture param No Parametrized fixture: "standalone", "distributed", or "container"
model_store fixture param No Parametrized fixture providing BentoML model store

Outputs

Name Type Description
bentoml_home str Temporary BentoML home directory path for the test session
clean_context contextlib.ExitStack Session-scoped exit stack for context manager cleanup
img_file str Path to a generated random BMP image file
bin_file str Path to a generated random binary file
prom_client PrometheusClient Prometheus metrics client instance

Usage Examples

# Run tests with gRPC support enabled
# Command line: pytest --run-grpc-tests

# Use the bentoml_home fixture in a test
import os

def test_my_service(bentoml_home):
    assert os.path.isdir(bentoml_home)

# Use the clean_context fixture
def test_with_cleanup(clean_context):
    ctx = clean_context.enter_context(some_context_manager())
    # ctx will be cleaned up after the session

# Use the deployment_mode parametrized fixture
def test_deployment(deployment_mode):
    # Runs once per deployment_mode: standalone, distributed, container
    assert deployment_mode in ("standalone", "distributed", "container")

# Mark a test as requiring GPU
import pytest

@pytest.mark.requires_gpus
def test_gpu_inference():
    pass
