
Principle:Langgenius Dify Prompt Template Design

From Leeroopedia
Knowledge Sources: Dify
Domains: LLM_Applications, Frontend, API
Last Updated: 2026-02-12 00:00 GMT

Overview

Description

Prompt Template Design is the principle governing how Dify applications structure and manage the instructions sent to the underlying LLM. The prompt template is the primary interface between the developer's intent and the model's behavior, defining the system instructions, user message patterns, variable placeholders, and conversation history formatting.

Dify provides two distinct prompt configuration structures based on the model mode:

  • ChatPromptConfig -- Used for chat-mode models. Contains an array of PromptItem objects, each with a role (system, user, or assistant) and text content. This maps directly to the message-based API format used by modern chat models.
  • CompletionPromptConfig -- Used for completion-mode models. Contains a single PromptItem (the prompt text) plus conversation_histories_role configuration that defines the prefixes for user and assistant turns when formatting conversation history into a single text block.
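
The two configuration shapes described above can be sketched as TypeScript types. Field names follow this article's description; the exact type definitions in the Dify codebase may differ.

```typescript
// Sketch of the two prompt configuration shapes described above.
// Field names follow this article; the actual Dify types may differ.

type PromptRole = 'system' | 'user' | 'assistant'

interface PromptItem {
  role?: PromptRole // present for chat-mode items
  text: string      // prompt text, possibly containing {{variable}} placeholders
}

// Chat-mode models: an ordered array of role-tagged messages.
interface ChatPromptConfig {
  prompt: PromptItem[]
}

// Completion-mode models: a single prompt plus role prefixes used when
// serializing conversation history into one text block.
interface CompletionPromptConfig {
  prompt: PromptItem
  conversation_histories_role: {
    user_prefix: string
    assistant_prefix: string
  }
}

const chatConfig: ChatPromptConfig = {
  prompt: [
    { role: 'system', text: 'You are a helpful assistant.' },
    { role: 'user', text: '{{query}}' },
  ],
}
```

The chat shape maps one-to-one onto message-based APIs, while the completion shape defers all conversation formatting to the platform.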

The platform also supports two prompt authoring modes:

  • Simple Mode (PromptMode.simple) -- The developer writes a single prompt template string with variable placeholders. The platform handles the formatting automatically.
  • Advanced Mode (PromptMode.advanced) -- The developer has full control over the message array structure, including system prompts, few-shot examples, and conversation history insertion points.

Prompt templates support variable interpolation through PromptVariable definitions, which allow dynamic content injection at runtime. Variables can be of type string, number, or select (enumerated dropdown), and may be marked as required.
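
A minimal sketch of typed variables and runtime injection follows. The `renderTemplate` helper is illustrative, not a Dify API; only the variable fields (key, type, required, select options) come from the description above.

```typescript
// Illustrative sketch of typed prompt variables and runtime interpolation.
// `renderTemplate` is a hypothetical helper, not a Dify API.

type PromptVariableType = 'string' | 'number' | 'select'

interface PromptVariable {
  key: string
  name: string
  type: PromptVariableType
  required: boolean
  options?: string[] // enumerated choices for 'select' variables
}

// Replace {{key}} placeholders with supplied values, enforcing `required`.
function renderTemplate(
  template: string,
  variables: PromptVariable[],
  values: Record<string, string | number>,
): string {
  for (const v of variables) {
    if (v.required && values[v.key] === undefined) {
      throw new Error(`Missing required variable: ${v.key}`)
    }
  }
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    values[key] !== undefined ? String(values[key]) : match,
  )
}

const vars: PromptVariable[] = [
  { key: 'topic', name: 'Topic', type: 'string', required: true },
]
const rendered = renderTemplate('Write a haiku about {{topic}}.', vars, {
  topic: 'autumn',
})
// rendered === 'Write a haiku about autumn.'
```

Declaring variables as typed parameters rather than hardcoding values is what enables the same template to be reused across execution contexts.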

The fetchPromptTemplate function retrieves recommended prompt templates from the backend based on the application mode, model mode, model name, and whether a knowledge base (dataset) is connected. This allows the platform to generate contextually appropriate prompt scaffolding.
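
The real fetchPromptTemplate calls the Dify backend; the local stand-in below only mirrors the inputs named above (application mode, model mode, model name, dataset flag) to show how those inputs could drive template selection. The function name `recommendTemplate` and the selection logic are assumptions for illustration.

```typescript
// Hypothetical local stand-in for the backend template recommendation.
// It mirrors the inputs the article names; it is NOT the actual Dify API.

interface TemplateRequest {
  appMode: 'chat' | 'completion'
  modelMode: 'chat' | 'completion'
  modelName: string
  hasDataset: boolean // whether a knowledge base is connected
}

function recommendTemplate(req: TemplateRequest): string {
  // Scaffold a context block only when a knowledge base is connected.
  const contextBlock = req.hasDataset
    ? 'Use the following context to answer:\n{{#context#}}\n\n'
    : ''
  return `${contextBlock}You are a helpful assistant.\n{{query}}`
}

const withDataset = recommendTemplate({
  appMode: 'chat',
  modelMode: 'chat',
  modelName: 'gpt-4',
  hasDataset: true,
})
```

The point is that scaffolding is conditional: connecting a dataset changes the recommended template so the model is told where retrieved context will be injected.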

The useDebugConfigurationContext React context maintains the full prompt state during the configuration workflow, including the current prompt mode, advanced prompt content, conversation history roles, and block status tracking (context, history, query insertion points).

Usage

Prompt Template Design is used whenever a developer needs to:

  • Define or modify the system instructions that govern the LLM's behavior.
  • Create variable-driven prompt templates for dynamic content injection.
  • Switch between simple and advanced prompt authoring modes.
  • Fetch recommended prompt templates based on the current application and model configuration.
  • Configure conversation history formatting for completion-mode models.
  • Manage the prompt editing state within the Dify configuration UI.

Theoretical Basis

Prompt Template Design implements the Template Method Pattern: the platform defines the skeleton of the prompt structure (system message, user input, context injection, history formatting) while allowing developers to customize the specific content at each insertion point.

The dual configuration model (ChatPromptConfig vs. CompletionPromptConfig) reflects the fundamental architectural divergence in LLM APIs:

  • Message-based APIs (chat models) accept structured arrays of role-tagged messages, enabling fine-grained control over the conversation context.
  • Text-based APIs (completion models) accept a single text string, requiring the framework to handle serialization of conversation history and role demarcation.
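
The serialization the second bullet requires can be sketched as follows, using the user/assistant prefixes from conversation_histories_role. Names and the exact separator are illustrative assumptions.

```typescript
// Sketch of serializing a role-tagged message history into the single
// text block a completion-mode API expects. Names are illustrative.

interface HistoryMessage {
  role: 'user' | 'assistant'
  text: string
}

function serializeHistory(
  history: HistoryMessage[],
  prefixes: { user_prefix: string; assistant_prefix: string },
): string {
  return history
    .map((m) =>
      m.role === 'user'
        ? `${prefixes.user_prefix}: ${m.text}`
        : `${prefixes.assistant_prefix}: ${m.text}`,
    )
    .join('\n')
}

const historyBlock = serializeHistory(
  [
    { role: 'user', text: 'What is Dify?' },
    { role: 'assistant', text: 'An LLM application platform.' },
  ],
  { user_prefix: 'Human', assistant_prefix: 'Assistant' },
)
// historyBlock:
// Human: What is Dify?
// Assistant: An LLM application platform.
```

Chat-mode models would receive the same two turns as structured messages; here the role demarcation survives only as text prefixes inside one string.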

The variable interpolation system follows the Dependency Injection principle at the prompt level: rather than hardcoding values into templates, variables are declared as typed parameters and injected at runtime, enabling reuse across different execution contexts.

The useDebugConfigurationContext implements the State Management pattern using React Context, providing a centralized store for the entire prompt configuration state that can be accessed by any component in the configuration UI tree without prop drilling.
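
A framework-agnostic sketch of the state shape such a context might centralize is given below. Field names are inferred from the description above, not copied from the Dify codebase; in the real UI this object would be supplied via a React Provider and read with useContext.

```typescript
// Framework-agnostic sketch of the configuration state the React context
// centralizes. Field names are inferred from this article's description.

type PromptMode = 'simple' | 'advanced'

interface BlockStatus {
  context: boolean // context insertion point present?
  history: boolean // conversation-history block present?
  query: boolean   // user-query block present?
}

interface DebugConfigurationState {
  promptMode: PromptMode
  advancedPrompt: { role: string; text: string }[]
  conversationHistoriesRole: { user_prefix: string; assistant_prefix: string }
  blockStatus: BlockStatus
}

// In React this value would come from a Provider; any component in the
// configuration UI tree could then read it without prop drilling.
const initialState: DebugConfigurationState = {
  promptMode: 'simple',
  advancedPrompt: [],
  conversationHistoriesRole: {
    user_prefix: 'Human',
    assistant_prefix: 'Assistant',
  },
  blockStatus: { context: false, history: false, query: true },
}
```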
