
Environment:Vibrantlabsai Ragas LLM Provider Credentials

From Leeroopedia
Knowledge Sources
Domains: Infrastructure, LLM_Evaluation, Credentials
Last Updated: 2026-02-12 10:00 GMT

Overview

Credential environment for configuring LLM and embedding provider API keys required by Ragas evaluation and test generation workflows.

Description

Ragas supports multiple LLM providers through its adapter system. Each provider requires specific API keys or credentials to be set as environment variables. The default provider is OpenAI, but the framework also supports Anthropic, Google (Gemini/Vertex AI), Amazon Bedrock, Oracle Cloud (OCI), and any LiteLLM-compatible provider. This environment page documents all credential variables referenced across the codebase.

Usage

Use this environment whenever running Ragas evaluations, test data generation, or experiments that call LLM APIs. Which credentials you need depends on the LLM provider you configure: at minimum, one LLM provider key for LLM-based metrics, plus an embedding provider key if you use embedding-based metrics.
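A minimal pre-flight check for the "at least one provider key" requirement can be sketched as follows. The provider names and groupings here are illustrative labels (not Ragas identifiers); the variable names are those documented in the Credentials section.

```python
import os

# Env vars that together satisfy each provider's minimum configuration.
# Provider names are illustrative labels, not Ragas identifiers.
PROVIDER_KEYS = {
    "openai": ["OPENAI_API_KEY"],
    "anthropic": ["ANTHROPIC_API_KEY"],
    "google": ["GOOGLE_API_KEY"],
    "vertexai": ["VERTEXAI_PROJECT", "VERTEXAI_LOCATION"],
    "bedrock": ["AWS_REGION_NAME", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"],
    "oci": ["OCI_MODEL_ID", "OCI_COMPARTMENT_ID"],
}


def configured_providers(env=os.environ):
    """Return the providers whose required variables are all set and non-empty."""
    return [name for name, keys in PROVIDER_KEYS.items()
            if all(env.get(k) for k in keys)]


def assert_some_provider(env=os.environ):
    """Fail fast before an evaluation run if no provider is configured."""
    providers = configured_providers(env)
    if not providers:
        raise RuntimeError(
            "No LLM provider credentials found; set e.g. OPENAI_API_KEY."
        )
    return providers
```

Running this before `evaluate()` turns a mid-run authentication failure into an immediate, readable error.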

System Requirements

| Category | Requirement | Notes |
|---|---|---|
| Network | Internet access | Required for API calls to LLM providers |
| OS | Any | Environment variables work across all platforms |

Dependencies

Python Packages

Provider-specific packages (install only for your provider):

  • OpenAI: `openai` >= 1.0.0 (included in core)
  • Anthropic: Via LangChain or LiteLLM
  • Google Gemini: `google-genai` (recommended) or `google-generativeai` (deprecated, EOL Aug 2025)
  • Google Vertex AI: `google-cloud-aiplatform`
  • Amazon Bedrock: Via LangChain Bedrock integration
  • Oracle OCI: `oci` >= 2.160.1
  • LiteLLM (universal): `litellm`

Credentials

LLM Provider Keys:

  • `OPENAI_API_KEY`: OpenAI API key. Required for the default LLM/embedding provider.
  • `ANTHROPIC_API_KEY`: Anthropic API key for Claude models.
  • `GOOGLE_API_KEY`: Google API key for Gemini models.

Google Cloud (Vertex AI):

  • `VERTEXAI_PROJECT`: Google Cloud project ID for Vertex AI.
  • `VERTEXAI_LOCATION`: Google Cloud region (e.g., `us-central1`).

Amazon Web Services:

  • `AWS_REGION_NAME`: AWS region for Bedrock service.
  • `AWS_ACCESS_KEY_ID`: AWS access key (or use IAM roles).
  • `AWS_SECRET_ACCESS_KEY`: AWS secret key (or use IAM roles).

Oracle Cloud Infrastructure:

  • `OCI_MODEL_ID`: OCI Generative AI model identifier.
  • `OCI_COMPARTMENT_ID`: OCI compartment OCID.
  • `OCI_ENDPOINT_ID`: OCI dedicated endpoint ID (optional).

Observability:

  • `MLFLOW_HOST`: MLflow tracking server hostname (required if using MLflow tracing).
  • `MLFLOW_TRACKING_URI`: Full MLflow tracking URI (used in examples).

WARNING: Never commit actual API key values to source control. Use `.env` files (which are git-ignored) or secret management systems.
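The `python-dotenv` package is the usual way to load a `.env` file. Purely to illustrate the file format this page assumes (simple `KEY=VALUE` lines, `#` comments), a minimal stdlib-only loader might look like:

```python
import os


def load_env_file(path=".env", env=os.environ, override=False):
    """Populate `env` from KEY=VALUE lines; comments and blanks are skipped."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            key = key.strip()
            value = value.strip().strip('"').strip("'")
            if override or key not in env:
                env[key] = value
```

By default, variables already present in the environment win, matching the common convention that real environment variables override `.env` entries.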

Quick Install

# Set OpenAI credentials (default provider)
export OPENAI_API_KEY="your-key-here"

# Or for Google Gemini
export GOOGLE_API_KEY="your-key-here"

# Or for Vertex AI
export VERTEXAI_PROJECT="your-project-id"
export VERTEXAI_LOCATION="us-central1"

Code Evidence

OpenAI key usage in test utilities from `tests/utils/llm_setup.py:25`:

OPENAI_API_KEY = "OPENAI_API_KEY"
ANTHROPIC_API_KEY = "ANTHROPIC_API_KEY"

MLflow host requirement from `src/ragas/integrations/tracing/mlflow.py:47-49`:

host = os.getenv("MLFLOW_HOST")
if host is None:
    raise ValueError("MLFLOW_HOST environment variable must be set")
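The fail-fast pattern above generalizes to any required credential. A hypothetical helper (not part of Ragas) that produces the same style of error message:

```python
import os


def require_env(name, hint=""):
    """Fetch a required environment variable or fail with a clear message."""
    value = os.getenv(name)
    if value is None:
        message = f"{name} environment variable must be set"
        if hint:
            message += f" ({hint})"
        raise ValueError(message)
    return value
```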

Google embeddings import handling from `src/ragas/embeddings/google_provider.py:197-204`:

raise ImportError(
    "Google AI (Gemini) embeddings require either:\n"
    "  - google-genai (recommended): pip install google-genai\n"
    "  - google-generativeai (deprecated): pip install google-generativeai"
)

OCI SDK detection from `src/ragas/llms/oci_genai_wrapper.py:76-83`:

if (
    self.client is None
    and GenerativeAiClient is None
    and self.endpoint_id is None
):
    raise ImportError(
        "OCI SDK not found. Please install it with: pip install oci"
    )
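The OCI snippet above illustrates the common optional-dependency pattern: attempt the import at module load, record the failure, and only raise when the feature is actually used. A generic sketch of that pattern (the factory function is hypothetical):

```python
try:
    import oci  # optional dependency; only needed for the OCI provider
except ImportError:
    oci = None


def make_oci_client():
    """Illustrative factory that defers the ImportError until first use."""
    if oci is None:
        raise ImportError(
            "OCI SDK not found. Please install it with: pip install oci"
        )
    # ... construct and return the real client here ...
```

Deferring the error this way lets users who never touch the OCI provider run Ragas without installing the `oci` package.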

Common Errors

| Error Message | Cause | Solution |
|---|---|---|
| `ValueError: MLFLOW_HOST environment variable must be set` | MLflow tracing enabled without MLFLOW_HOST | `export MLFLOW_HOST="your-mlflow-server"` |
| `AuthenticationError: Incorrect API key` | Invalid or expired OpenAI API key | Verify key at platform.openai.com and re-export |
| `ImportError: OCI SDK not found` | OCI integration used without SDK | `pip install oci>=2.160.1` |
| `ImportError: Google AI (Gemini) embeddings require...` | Missing Google SDK for embeddings | `pip install google-genai` (recommended) |

Compatibility Notes

  • Google Gemini: Two SDK versions exist. The new `google-genai` SDK is recommended. The old `google-generativeai` SDK is deprecated with EOL August 2025.
  • Google Gemini + Instructor: Known upstream bug with safety settings (tracked at github.com/567-labs/instructor/issues/1658). Workaround: use OpenAI-compatible endpoint with Gemini base URL.
  • LiteLLM: Acts as a universal adapter supporting 100+ providers. Use when your provider is not directly supported.
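LiteLLM routes requests based on a `provider/model` string, with OpenAI models needing no prefix. A small helper sketching that convention (model names below are illustrative, not recommendations):

```python
def litellm_model_string(provider, model):
    """Build a LiteLLM routing string, e.g. "gemini/gemini-1.5-pro"."""
    if provider == "openai":
        return model  # OpenAI models are passed through unprefixed
    return f"{provider}/{model}"
```

The resulting string is what you would pass as `model=` when configuring a LiteLLM-backed LLM; LiteLLM then reads the matching provider key from the environment.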
