
Implementation:Infiniflow Ragflow LLM Constants

From Leeroopedia
Domains Frontend, LLM, Configuration
Last Updated 2026-02-12 06:00 GMT

Overview

A comprehensive registry of 60+ supported LLM provider factory names and model type definitions for the RAGFlow frontend.

Description

The constants/llm.ts module defines the complete registry of supported LLM providers (OpenAI, Azure, Anthropic, Google, HuggingFace, Tongyi, DeepSeek, Ollama, etc.), model type enums (chat, embedding, image2text, speech2text, tts, reranking), and provider-specific configuration metadata.

Usage

Import these constants when building LLM provider selection UIs, model configuration forms, or any feature that needs to enumerate or validate LLM provider options.
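One common use is feeding the provider registry into a selection dropdown. The sketch below is illustrative: `LlmFactoryList` is a local stand-in excerpt (the real list of 60+ names is imported from '@/constants/llm'), and `SelectOption` is a hypothetical shape for a UI select component.

```typescript
// Sketch: turning the provider registry into UI select options.
// LlmFactoryList here is a local stand-in excerpt; the real list
// (60+ entries) is imported from '@/constants/llm'.
const LlmFactoryList: string[] = ['OpenAI', 'Anthropic', 'Ollama', 'DeepSeek'];

interface SelectOption {
  label: string;
  value: string;
}

// Map each factory name to a { label, value } pair for a dropdown.
const providerOptions: SelectOption[] = LlmFactoryList.map((name) => ({
  label: name,
  value: name,
}));
```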

Code Reference

Source Location

Signature

export enum LlmModelType {
  Chat = 'chat',
  Embedding = 'embedding',
  Image2Text = 'image2text',
  Speech2Text = 'speech2text',
  TTS = 'tts',
  Reranking = 'reranking',
}

export const LlmFactoryList: string[];  // 60+ provider names

Import

import { LlmModelType, LlmFactoryList } from '@/constants/llm';

I/O Contract

Inputs

Module-level constants; no inputs.

Outputs

Name            Type      Description
LlmModelType    enum      All LLM model type identifiers
LlmFactoryList  string[]  All supported LLM provider names
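Because the enum values are plain strings, they also work as a runtime validation set. A minimal sketch, assuming the enum body matches the Signature above (the enum is redeclared locally here so the snippet is self-contained):

```typescript
// Local redeclaration mirroring the Signature section above.
enum LlmModelType {
  Chat = 'chat',
  Embedding = 'embedding',
  Image2Text = 'image2text',
  Speech2Text = 'speech2text',
  TTS = 'tts',
  Reranking = 'reranking',
}

// Type guard: true only for the six known model type identifiers.
function isLlmModelType(value: string): value is LlmModelType {
  return (Object.values(LlmModelType) as string[]).includes(value);
}
```

A guard like this lets form handlers narrow untyped strings (e.g. from a URL query or API response) to `LlmModelType` before use.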

Usage Examples

import { LlmModelType, LlmFactoryList } from '@/constants/llm';

// Filter to providers with chat models; supportsChatModel is an
// app-defined predicate, not exported by this module.
const chatProviders = LlmFactoryList.filter((p) => supportsChatModel(p));
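Configuration forms often need models grouped per type. The sketch below uses a truncated local copy of the enum and hypothetical sample records (`ModelRecord` and the `models` array are illustrative, not part of the module):

```typescript
// Truncated local copy of the enum for a self-contained example.
enum LlmModelType {
  Chat = 'chat',
  Embedding = 'embedding',
  Reranking = 'reranking',
}

// Hypothetical model record shape; real records come from the API.
interface ModelRecord {
  name: string;
  type: LlmModelType;
}

const models: ModelRecord[] = [
  { name: 'gpt-4o', type: LlmModelType.Chat },
  { name: 'text-embedding-3-small', type: LlmModelType.Embedding },
];

// Bucket by enum value so each form section lists only its own models.
const byType = models.reduce<Record<string, ModelRecord[]>>((acc, m) => {
  (acc[m.type] ??= []).push(m);
  return acc;
}, {});
```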
