
Principle:Protectai Llm guard API Application Factory

From Leeroopedia
Knowledge Sources
Domains API_Design, Web_Development, DevOps
Last Updated 2026-02-14 12:00 GMT

Overview

A factory pattern that constructs a fully configured FastAPI application with scanner pipelines, authentication, rate limiting, CORS, and observability instrumentation from a YAML configuration file.

Description

The API application factory pattern encapsulates all server initialization logic into a single function that reads configuration, instantiates scanners, configures middleware, registers routes, and instruments the application for observability. This pattern enables clean separation between configuration and execution, supporting both eager and lazy scanner loading.

Usage

Use this principle when deploying LLM Guard as a REST API service. The factory pattern allows the same application to be configured differently for development (debug mode, console logging) and production (auth enabled, OTLP tracing).
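As a sketch of how one YAML file per environment might drive the factory, consider the following fragments. The key names here are illustrative, not LLM Guard's exact configuration schema:

```yaml
# development.yml (illustrative keys, not the exact schema)
app:
  name: llm-guard-api
  log_level: DEBUG       # verbose console logging
auth:
  enabled: false
tracing:
  exporter: console

# production.yml
app:
  name: llm-guard-api
  log_level: INFO
auth:
  enabled: true          # token auth required
tracing:
  exporter: otlp         # ship spans to an OTLP collector
```

Because the factory reads everything from the file, switching environments is a matter of pointing it at a different config, with no code changes.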

Theoretical Basis

# Pseudocode for API application factory
def create_app(config_file):
    config = load_config(config_file)
    configure_logging(config)
    configure_observability(config)
    scanners = load_scanners(config)
    app = FastAPI(title=config.name)
    register_middleware(app, config)  # CORS, rate limiting, auth
    register_routes(app, scanners)   # /analyze/prompt, /analyze/output, etc.
    instrument_app(app)              # OpenTelemetry
    return app
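The pseudocode above can be made concrete without any web framework, to show how each step composes. Everything below is a framework-free sketch: `Config`, the dict-based `app`, and the middleware names are hypothetical placeholders for the FastAPI objects the real factory builds.

```python
from dataclasses import dataclass, field

@dataclass
class Config:
    # Hypothetical config object; the real factory parses this from YAML.
    name: str = "llm-guard-api"
    debug: bool = False
    scanner_names: list = field(default_factory=lambda: ["Toxicity", "PromptInjection"])

def load_scanners(config):
    # Stand-in for scanner instantiation.
    return {n: {"name": n} for n in config.scanner_names}

def create_app(config):
    scanners = load_scanners(config)
    # A dict stands in for the FastAPI instance in this sketch.
    app = {"title": config.name, "middleware": [], "routes": {}}
    # Middleware registration: auth is skipped in debug mode.
    app["middleware"] += ["cors", "rate_limit"]
    if not config.debug:
        app["middleware"].append("auth")
    # Route registration binds the scanner pipeline to endpoints.
    app["routes"]["/analyze/prompt"] = scanners
    app["routes"]["/analyze/output"] = scanners
    return app

dev_app = create_app(Config(debug=True))
prod_app = create_app(Config())
```

The same `create_app` produces differently shaped applications purely from configuration, which is the core of the factory pattern described above.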

Related Pages

Implemented By
