Implementation: Langgenius Dify SendChatMessage
| Knowledge Sources | |
|---|---|
| Domains | Real-Time Streaming, REST API, Server-Sent Events |
| Last Updated | 2026-02-08 00:00 GMT |
Overview
A concrete tool, provided by the Dify platform, for sending debug messages and receiving streaming responses during application preview.
Description
The debug service module provides several functions for interacting with an application during the debug/preview phase. The primary functions are:
1. sendCompletionMessage -- Sends a completion request to the application and receives the response via Server-Sent Events (SSE) streaming. The function automatically sets response_mode: 'streaming' and delegates to the ssePost utility, which manages the SSE connection lifecycle. Four callback hooks allow the caller to react to streaming events in real time.
2. stopChatMessageResponding -- Sends a stop signal to terminate an in-progress generation, for example when the user decides mid-stream that the response is not helpful or when a timeout occurs.
3. fetchSuggestedQuestions -- Retrieves AI-generated follow-up question suggestions for a specific message, enabling the suggested-questions-after-answer feature.
4. fetchConversationMessages -- Retrieves the full message history for a conversation, used to restore conversation state when returning to a debug session.
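The four callback hooks on sendCompletionMessage form the streaming contract between the service layer and the UI. The sketch below shows how a consumer might accumulate chunks and react to replacement events; the callback signatures here are assumptions inferred from usage (the real IOnData/IOnCompleted/IOnError/IOnMessageReplace types live elsewhere in the codebase and may carry more fields):

```typescript
// Hypothetical callback shapes, inferred from usage; the actual type
// definitions in the Dify codebase may differ.
type OnData = (message: string, isFirstMessage: boolean, moreInfo: { taskId: string }) => void
type OnCompleted = () => void
type OnError = (error: Error) => void
type OnMessageReplace = (newMessage: string) => void

// A tiny accumulator demonstrating how the four hooks cooperate:
// onData appends chunks, onMessageReplace overwrites the buffer,
// onCompleted marks the stream finished.
function makeAccumulator() {
  let text = ''
  let taskId = ''
  let done = false
  const onData: OnData = (message, _isFirstMessage, info) => {
    text += message
    taskId = info.taskId
  }
  const onCompleted: OnCompleted = () => { done = true }
  const onMessageReplace: OnMessageReplace = (newMessage) => { text = newMessage }
  const onError: OnError = (error) => { console.error(error.message) }
  return { onData, onCompleted, onMessageReplace, onError, state: () => ({ text, taskId, done }) }
}

// Simulate a short SSE stream locally (no network involved).
const acc = makeAccumulator()
acc.onData('Gradient descent ', true, { taskId: 't-1' })
acc.onData('minimizes loss.', false, { taskId: 't-1' })
acc.onCompleted()
```

The same accumulator object can be spread directly into the callbacks argument of sendCompletionMessage.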
Usage
Use these functions when:
- Implementing the debug chat panel's send/receive logic
- Building the streaming response renderer
- Adding stop-generation functionality to the UI
- Loading conversation history on debug panel mount
Code Reference
Source Location
- Repository: Dify
- File: web/service/debug.ts (lines 28-64)
Signature
```typescript
// Send a streaming completion message -- POST /apps/{appId}/completion-messages (SSE)
export const sendCompletionMessage = async (
  appId: string,
  body: Record<string, any>,
  { onData, onCompleted, onError, onMessageReplace }: {
    onData: IOnData
    onCompleted: IOnCompleted
    onError: IOnError
    onMessageReplace: IOnMessageReplace
  },
) => {
  return ssePost(`apps/${appId}/completion-messages`, {
    body: {
      ...body,
      response_mode: 'streaming',
    },
  }, { onData, onCompleted, onError, onMessageReplace })
}

// Stop an in-progress generation -- POST /apps/{appId}/chat-messages/{taskId}/stop
export const stopChatMessageResponding = async (appId: string, taskId: string) => {
  return post(`apps/${appId}/chat-messages/${taskId}/stop`)
}

// Fetch suggested follow-up questions -- GET /apps/{appId}/chat-messages/{messageId}/suggested-questions
export const fetchSuggestedQuestions = (
  appId: string,
  messageId: string,
  getAbortController?: any,
) => {
  return get(
    `apps/${appId}/chat-messages/${messageId}/suggested-questions`,
    {},
    { getAbortController },
  )
}

// Fetch conversation message history -- GET /apps/{appId}/chat-messages
export const fetchConversationMessages = (
  appId: string,
  conversation_id: string,
  getAbortController?: any,
) => {
  return get(`apps/${appId}/chat-messages`, {
    params: { conversation_id },
  }, { getAbortController })
}
```
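The optional `getAbortController` parameter on the two fetchers lets the caller capture the AbortController that the request helper creates, so an in-flight request can be cancelled (for example when the debug panel unmounts). A minimal sketch of the pattern, with a local stand-in for the service-layer `get` helper (the stand-in's name and shape are assumptions for illustration, not the real helper):

```typescript
// Stand-in for a request helper that creates an AbortController, hands it to
// the caller via getAbortController, and uses its signal for the request.
function getWithAbort<T>(
  run: (signal: AbortSignal) => Promise<T>,
  options?: { getAbortController?: (c: AbortController) => void },
): Promise<T> {
  const controller = new AbortController()
  options?.getAbortController?.(controller)
  return run(controller.signal)
}

// The caller captures the controller, then aborts on cleanup.
let captured: AbortController | undefined
const pending = getWithAbort(
  signal => new Promise<string>((_resolve, reject) => {
    // This stand-in request never resolves; it only reacts to abort.
    signal.addEventListener('abort', () => reject(new Error('aborted')))
  }),
  { getAbortController: c => { captured = c } },
)
pending.catch(() => { /* swallow the expected abort rejection */ })
captured?.abort()
```

In the debug panel, the abort call would typically live in a component-unmount cleanup hook.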
Import
```typescript
import {
  sendCompletionMessage,
  stopChatMessageResponding,
  fetchSuggestedQuestions,
  fetchConversationMessages,
} from '@/service/debug'
```
I/O Contract
Inputs (sendCompletionMessage)
| Name | Type | Required | Description |
|---|---|---|---|
| appId | string | Yes | The application ID to send the message to |
| body | Record<string, any> | Yes | Request payload including inputs, query, conversation_id, and model configuration overrides |
| onData | IOnData | Yes | Callback invoked for each streaming data chunk |
| onCompleted | IOnCompleted | Yes | Callback invoked when the stream completes |
| onError | IOnError | Yes | Callback invoked when an error occurs |
| onMessageReplace | IOnMessageReplace | Yes | Callback invoked when the server sends a message replacement event |
Inputs (stopChatMessageResponding)
| Name | Type | Required | Description |
|---|---|---|---|
| appId | string | Yes | The application ID |
| taskId | string | Yes | The task/message ID of the generation to stop |
Outputs
| Name | Type | Description |
|---|---|---|
| sendCompletionMessage | void (callbacks) | Response data is delivered via the onData/onCompleted/onError/onMessageReplace callbacks, not as a return value |
| stopChatMessageResponding | Promise<any> | Confirmation that the stop signal was received |
| fetchSuggestedQuestions | Promise<any> | Array of suggested follow-up questions |
| fetchConversationMessages | Promise<any> | Array of message objects for the given conversation |
Usage Examples
```typescript
import { sendCompletionMessage, stopChatMessageResponding } from '@/service/debug'

let responseText = ''
let currentTaskId = ''

// Send a streaming debug message
sendCompletionMessage(appId, {
  inputs: { topic: 'machine learning' },
  query: 'Explain gradient descent',
  conversation_id: conversationId,
}, {
  onData: (message, isFirstMessage, { taskId }) => {
    responseText += message
    currentTaskId = taskId
    updateUI(responseText)
  },
  onCompleted: () => {
    finalizeResponse(responseText)
  },
  onError: (error) => {
    showErrorToast(error.message)
  },
  onMessageReplace: (newMessage) => {
    responseText = newMessage
    updateUI(responseText)
  },
})

// Stop generation if the user clicks "Stop"
const handleStop = () => {
  stopChatMessageResponding(appId, currentTaskId)
}
```
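The example above covers sending and stopping; the two fetchers typically run when the debug panel mounts. The sketch below restores history and then loads follow-up suggestions for the last answer. The fetchers are injected so the flow can be exercised without a network, and the `{ data: [...] }` payload shapes and Message fields are assumptions for illustration, not the documented API response:

```typescript
// Payload shapes assumed for illustration; the real API response fields may differ.
type Message = { id: string; query: string; answer: string }
type MessagesResponse = { data: Message[] }
type SuggestionsResponse = { data: string[] }

// Restore conversation history, then fetch suggestions for the last message.
// In the app, the injected fetchers would be fetchConversationMessages and
// fetchSuggestedQuestions (partially applied with getAbortController as needed).
async function loadDebugSession(
  appId: string,
  conversationId: string,
  fetchMessages: (appId: string, conversationId: string) => Promise<MessagesResponse>,
  fetchSuggestions: (appId: string, messageId: string) => Promise<SuggestionsResponse>,
) {
  const history = await fetchMessages(appId, conversationId)
  const last = history.data[history.data.length - 1]
  const suggestions = last ? await fetchSuggestions(appId, last.id) : { data: [] as string[] }
  return { messages: history.data, suggestions: suggestions.data }
}

// Exercise the flow with local stubs standing in for the real service calls.
const stubMessages = async (): Promise<MessagesResponse> =>
  ({ data: [{ id: 'm1', query: 'Explain gradient descent', answer: 'It minimizes loss.' }] })
const stubSuggestions = async (_appId: string, messageId: string): Promise<SuggestionsResponse> =>
  ({ data: [`What is the learning rate? (after ${messageId})`] })

const sessionReady = loadDebugSession('app-1', 'conv-1', stubMessages, stubSuggestions)
```

Injecting the fetchers also keeps the mount logic unit-testable, since the real service calls require a live app and credentials.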