
Environment:Marker Inc Korea AutoRAG Python 3.10 Runtime

From Leeroopedia
Domains: Infrastructure, RAG
Last Updated: 2026-02-12 00:00 GMT

Overview

Python 3.10+ environment with core dependencies for running AutoRAG's RAG pipeline optimization, evaluation, and deployment features.

Description

This environment defines the base runtime requirements for AutoRAG. It targets Python 3.10, 3.11, or 3.12 on any platform. The core dependency set includes LlamaIndex for LLM orchestration, OpenAI client for API access, ChromaDB and five other vector database backends, Quart for async API serving, Gradio and Streamlit for web UIs, Panel for dashboards, and evaluation libraries (sacrebleu, rouge_score, evaluate). NumPy is pinned to 1.26.4 to avoid incompatibilities with NumPy 2.0.

Usage

Use this environment for all AutoRAG operations: data creation, pipeline optimization, deployment, and evaluation. This is the mandatory base environment. GPU-specific, Korean, Japanese, and parse extras build on top of this.

System Requirements

| Category | Requirement | Notes |
|---|---|---|
| OS | Linux, macOS, or Windows | No platform-specific restrictions |
| Python | >= 3.10, <= 3.12 | Classifiers list 3.10, 3.11, 3.12 |
| Disk | 2 GB+ free | For package installation and index storage |
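The supported interpreter window can be checked before installation. The helper below is an illustrative sketch of that check based on the range documented here; it is not part of AutoRAG's CLI.

```python
import sys

def is_supported(version_info=sys.version_info) -> bool:
    """True when the interpreter's (major, minor) falls in the 3.10-3.12 range."""
    return (3, 10) <= tuple(version_info[:2]) <= (3, 12)

if not is_supported():
    # Warn rather than abort; pip will enforce requires-python on its own.
    print(
        f"warning: Python {sys.version_info.major}.{sys.version_info.minor} "
        "is outside AutoRAG's supported range (>= 3.10, <= 3.12)"
    )
```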

Dependencies

Core Python Packages

  • `pydantic` >= 2.9.2
  • `numpy` == 1.26.4 (pinned; avoids NumPy 2.0 incompatibilities)
  • `pandas` >= 2.2.3
  • `tqdm` >= 4.67.1
  • `tiktoken` >= 0.9.0
  • `openai` >= 1.90.0
  • `rank_bm25` >= 0.2.2
  • `pyyaml` >= 6.0.2
  • `pyarrow` >= 20.0.0
  • `fastparquet` >= 2024.11.0
  • `scikit-learn` >= 1.7.0
  • `click` >= 8.2.1
  • `rich` >= 14.0.0
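Installed versions of the packages above can be audited with the standard library alone. This is a hedged sketch: the pin/minimum values are copied from this page's list, not read from AutoRAG's `pyproject.toml`.

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

# Pins taken from the dependency list above (illustrative subset).
PINS = {"numpy": "1.26.4"}

def installed_version(dist: str) -> Optional[str]:
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None

for dist, pin in PINS.items():
    got = installed_version(dist)
    status = "missing" if got is None else ("ok" if got == pin else f"mismatch ({got})")
    print(f"{dist}: {status}")
```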

Evaluation Packages

  • `sacrebleu` >= 2.5.1
  • `evaluate` >= 0.4.4
  • `rouge_score` >= 0.1.2
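These libraries score generated answers against reference texts. As a rough illustration of what a BLEU-style metric measures, here is a toy clipped unigram-precision sketch; it is not sacrebleu's actual algorithm, which adds higher n-gram orders, multi-reference clipping, and a brevity penalty.

```python
from collections import Counter

def unigram_precision(candidate: str, reference: str) -> float:
    """Fraction of candidate tokens that appear in the reference, with each
    reference token usable at most as often as it occurs there."""
    cand = candidate.lower().split()
    if not cand:
        return 0.0
    ref_counts = Counter(reference.lower().split())
    matched = sum(
        min(count, ref_counts.get(token, 0))
        for token, count in Counter(cand).items()
    )
    return matched / len(cand)
```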

LlamaIndex Ecosystem

  • `llama-index` >= 0.12.42
  • `llama-index-core` >= 0.12.42
  • `llama-index-readers-file` >= 0.4.9
  • `llama-index-embeddings-openai` >= 0.3.1
  • `llama-index-llms-openai` >= 0.4.7
  • `llama-index-llms-openai-like` >= 0.4.0
  • `llama-index-llms-bedrock` >= 0.3.8
  • `llama-index-retrievers-bm25` >= 0.5.2

Vector Database Backends

  • `chromadb` >= 1.0.0
  • `pymilvus` >= 2.6.0b0
  • `weaviate-client` >= 4.15.2
  • `pinecone[grpc]`
  • `couchbase` >= 4.4.0
  • `qdrant-client` >= 1.12.1
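All six backends fill the same role in the pipeline: store embedding vectors and return those nearest to a query. A minimal in-memory sketch of that contract, using cosine similarity over plain Python lists (real backends add persistence, approximate-nearest-neighbor indexes, and metadata filtering):

```python
import math
from typing import Dict, List, Tuple

def cosine(a: List[float], b: List[float]) -> float:
    """Cosine similarity; 0.0 for zero-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query: List[float], store: Dict[str, List[float]], k: int = 3) -> List[Tuple[str, float]]:
    """Rank stored vectors by similarity to the query, best first."""
    scored = [(doc_id, cosine(query, vec)) for doc_id, vec in store.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]
```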

API Server and Web UI

  • `quart` >= 0.20.0
  • `pyngrok` >= 7.2.11
  • `fastapi` >= 0.115.13
  • `streamlit` >= 1.46.0
  • `gradio` >= 5.34.2
  • `panel` >= 1.7.1

LangChain (Pinned)

  • `langchain-core` == 0.2.43
  • `langchain-unstructured` == 0.1.2
  • `langchain-upstage` == 0.1.5
  • `langchain-community` == 0.2.19
  • `langsmith` == 0.1.147

Credentials

See Environment:Marker_Inc_Korea_AutoRAG_API_Keys_Configuration for all required API keys.

Quick Install

# Install base AutoRAG
pip install AutoRAG

# Or install with all extras (GPU + Korean + Japanese + Parse)
pip install "AutoRAG[all]"

Code Evidence

Python version requirement from `pyproject.toml:9`:

requires-python = ">=3.10"

NumPy pinning rationale from `pyproject.toml:26`:

"numpy==1.26.4",  # temporal not using numpy 2.0.0

Optional GPU import guard from `autorag/__init__.py:61-72`:

try:
    from llama_index.llms.huggingface import HuggingFaceLLM
    from llama_index.llms.ollama import Ollama
    generator_models["huggingfacellm"] = HuggingFaceLLM
    generator_models["ollama"] = Ollama
except ImportError:
    logger.info(
        "You are using API version of AutoRAG."
        "To use local version, run pip install 'AutoRAG[gpu]'"
    )

Common Errors

| Error Message | Cause | Solution |
|---|---|---|
| `numpy 2.0 incompatibility` | NumPy 2.0 installed | Ensure `numpy==1.26.4` is installed; AutoRAG pins this version |
| `ImportError: llama_index` | LlamaIndex not installed | `pip install AutoRAG` installs all LlamaIndex dependencies |
| `grpcio version conflict` | grpcio outside pinned range | AutoRAG requires `grpcio>=1.66.2,<1.68.0`; check for conflicting packages |

Compatibility Notes

  • Python 3.9 and below: Not supported. The `requires-python` field enforces >= 3.10.
  • NumPy 2.0: Explicitly excluded via pinned `numpy==1.26.4`. Upstream incompatibilities exist with pandas and other dependencies.
  • LangChain versions: Pinned to exact versions (e.g., `langchain-core==0.2.43`) due to breaking API changes across LangChain releases.
  • gRPC: Constrained to `>=1.66.2,<1.68.0` range due to compatibility with Pinecone and Milvus clients.
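If another package in the same environment tries to pull NumPy or gRPC outside these ranges, a pip constraints file can hold the documented pins during installation. This is a sketch: the file name and `-c` workflow are a general pip technique, not part of AutoRAG's own tooling.

```text
# constraints.txt -- pins documented on this page
numpy==1.26.4
grpcio>=1.66.2,<1.68.0
langchain-core==0.2.43
```

Install with `pip install AutoRAG -c constraints.txt` so the resolver cannot select versions outside these ranges.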
