
Environment: Guardrails AI OpenTelemetry Tracing

From Leeroopedia
Knowledge Sources
Domains: Infrastructure, Observability
Last Updated: 2026-02-14 12:00 GMT

Overview

OpenTelemetry SDK environment with OTLP exporters for Guardrails execution tracing and observability.

Description

This environment configures the OpenTelemetry tracing stack used by Guardrails AI for observability. It includes the OTel SDK, OTLP HTTP and gRPC exporters, and optional OpenInference semantic conventions. The tracer is implemented as a thread-safe singleton that auto-configures based on environment variables. When OTLP environment variables are set, traces are exported via BatchSpanProcessor to an OTLP endpoint; otherwise, traces are emitted to stderr via ConsoleSpanExporter using a SimpleSpanProcessor.

Usage

Use this environment when you need production observability for Guardrails validation pipelines. Required for the Settings_OpenTelemetry implementation and any workflow that needs distributed tracing. Recommended backend is Grafana with an OTLP collector.

System Requirements

| Category | Requirement | Notes |
| --- | --- | --- |
| Python | >= 3.10 | Same as base Guardrails requirement |
| Network | Outbound HTTPS/gRPC | To reach the OTLP collector endpoint |

Dependencies

Core Packages (included in guardrails-ai)

  • `opentelemetry-sdk` >= 1.24.0, < 2.0.0
  • `opentelemetry-exporter-otlp-proto-grpc` >= 1.24.0, < 2.0.0
  • `opentelemetry-exporter-otlp-proto-http` >= 1.24.0, < 2.0.0

Optional Packages

  • `openinference-semconv` — OpenInference semantic conventions for LLM span attributes. Imported with `try/except ImportError` and gracefully degrades to `None`.

Credentials

The following environment variables configure the OTLP exporter:

  • `OTEL_EXPORTER_OTLP_PROTOCOL`: Protocol for the OTLP exporter (e.g., `http/protobuf`, `grpc`). Required for batch export.
  • `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT`: Specific endpoint for trace data (e.g., `https://otel-collector:4318/v1/traces`). Takes precedence over generic endpoint.
  • `OTEL_EXPORTER_OTLP_ENDPOINT`: Generic OTLP endpoint (e.g., `https://otel-collector:4318`). Used if traces endpoint is not set.
  • `OTEL_EXPORTER_OTLP_HEADERS`: Headers for OTLP requests (e.g., `Authorization=Bearer xxx`).

Note: If neither `OTEL_EXPORTER_OTLP_PROTOCOL` nor an endpoint variable is set, the tracer falls back to console output on stderr.
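A minimal configuration for the batch-export path might look like the following; the endpoint and token are placeholders, not real defaults:

```shell
# Route traces to an OTLP collector over HTTP/protobuf.
# Endpoint and bearer token below are illustrative placeholders.
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel-collector.example.com:4318"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer example-token"
```

With these set, the tracer selects `BatchSpanProcessor` with an OTLP exporter; unset them to fall back to console output.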

Quick Install

# OpenTelemetry packages are included in guardrails-ai core
pip install guardrails-ai

# Optional: OpenInference for LLM semantic conventions
pip install openinference-semconv

Code Evidence

Singleton tracer initialization with env var detection from `guardrails/telemetry/default_otlp_tracer_mod.py:36-54`:

def _initialize(self, resource_name: str):
    envvars_exist = os.environ.get("OTEL_EXPORTER_OTLP_PROTOCOL") and (
        os.environ.get("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT")
        or os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT")
    )

    resource = Resource(attributes={SERVICE_NAME: resource_name})

    traceProvider = TracerProvider(resource=resource)
    span_exporter = OTLPSpanExporter()
    if envvars_exist:
        processor = BatchSpanProcessor(span_exporter=span_exporter)
    else:
        processor = SimpleSpanProcessor(ConsoleSpanExporter(out=sys.stderr))

    traceProvider.add_span_processor(processor)
    trace.set_tracer_provider(traceProvider)

    self.tracer = trace.get_tracer("guardrails-ai", GUARDRAILS_VERSION)

Optional OpenInference import from `guardrails/telemetry/guard_tracing.py:12-15`:

try:
    from openinference.semconv.trace import SpanAttributes
except ImportError:
    SpanAttributes = None
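Downstream code then has to tolerate `SpanAttributes` being `None`. The helper below is a hypothetical sketch of that pattern; only the `SpanAttributes.LLM_MODEL_NAME` constant comes from the OpenInference conventions:

```python
try:
    from openinference.semconv.trace import SpanAttributes
except ImportError:  # openinference-semconv not installed
    SpanAttributes = None


def llm_attributes(model: str) -> dict:
    """Hypothetical helper: attach OpenInference LLM span attributes
    only when the optional package is importable."""
    if SpanAttributes is None:
        return {}  # degrade gracefully: no LLM-specific attributes
    return {SpanAttributes.LLM_MODEL_NAME: model}


print(llm_attributes("gpt-4o"))
```

Without the optional package this returns an empty dict, so spans are still emitted, just without the LLM-specific attributes.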

Common Errors

| Error Message | Cause | Solution |
| --- | --- | --- |
| Traces appear on stderr instead of collector | OTLP env vars not set | Set `OTEL_EXPORTER_OTLP_PROTOCOL` and an endpoint variable |
| `Connection refused` to OTLP endpoint | Collector not running | Start your OTLP collector (e.g., Grafana Alloy, Jaeger) |
| Missing LLM span attributes | `openinference-semconv` not installed | `pip install openinference-semconv` |

Compatibility Notes

  • gRPC vs HTTP: The codebase includes both gRPC and HTTP OTLP exporters. A TODO comment in the source notes that the choice between them should be configurable but currently defaults to HTTP.
  • Grafana: The official documentation recommends Grafana as the telemetry backend. See the Guardrails telemetry docs for a full Grafana setup example.
  • Thread Safety: The `DefaultOtlpTracer` uses a double-checked locking singleton pattern with `threading.Lock` for safe concurrent initialization.
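The double-checked locking pattern reads roughly as follows; the class name is illustrative, not the actual Guardrails implementation:

```python
import threading


class SingletonTracer:
    """Sketch of double-checked locking: the lock is only taken on the
    first construction, not on every subsequent access."""

    _instance = None
    _lock = threading.Lock()

    def __new__(cls):
        if cls._instance is None:          # first check, without the lock
            with cls._lock:
                if cls._instance is None:  # second check, under the lock
                    cls._instance = super().__new__(cls)
        return cls._instance


a, b = SingletonTracer(), SingletonTracer()
print(a is b)  # → True
```

The second check inside the lock prevents two threads that both passed the first check from each creating an instance.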
