Implementation:InternLM Lmdeploy BaseChatTemplate Messages2prompt
| Knowledge Sources | |
|---|---|
| Domains | NLP, Prompt_Engineering |
| Last Updated | 2026-02-07 15:00 GMT |
Overview
A concrete tool, provided by the LMDeploy library, for converting OpenAI-format message lists into model-specific prompt strings.
Description
The BaseChatTemplate class provides prompt formatting methods that apply model-specific tokens (system prefix, user/assistant markers, end-of-turn tokens) to raw messages. It uses a registry system (mmengine's MODELS registry) in which each model family registers its template; lookup happens automatically based on the model name.
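To illustrate the registry pattern described above, here is a minimal, self-contained sketch of name-based template lookup. This is not the library's implementation (LMDeploy uses mmengine's MODELS registry); the `register_template` decorator and `lookup` function below are hypothetical stand-ins that show the general mechanism.

```python
# Illustrative sketch of a name-based template registry.
# NOT the lmdeploy/mmengine implementation; names here are hypothetical.
TEMPLATES = {}

def register_template(name):
    """Decorator that records a template class under a family name."""
    def wrap(cls):
        TEMPLATES[name] = cls
        return cls
    return wrap

@register_template('internlm2')
class InternLM2Template:
    user = '<|im_start|>user\n'  # example role marker for this family

def lookup(model_name):
    """Resolve a template by substring match on the model name,
    trying longer (more specific) registry keys first."""
    for key in sorted(TEMPLATES, key=len, reverse=True):
        if key in model_name.lower():
            return TEMPLATES[key]()
    raise KeyError(f'no template registered for {model_name}')

template = lookup('internlm2-chat-7b')
```

Matching longer keys first lets a more specific family name (e.g. a `-chat` variant) shadow a generic base-model entry.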
Usage
These methods are called automatically during pipeline inference. Import the class directly when debugging prompt formatting or building custom prompt engineering workflows outside the standard pipeline.
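The core of the conversion is a concatenation of role markers around each message's content. The sketch below mimics that logic in plain Python so it can be run without LMDeploy; the parameter names mirror the BaseChatTemplate fields and the default token values are the InternLM2 ChatML-style markers, but this is an illustrative approximation, not the library's code.

```python
# Illustrative sketch of the messages2prompt concatenation logic.
# Parameter names mirror BaseChatTemplate fields; this is NOT the
# actual lmdeploy implementation.
def messages2prompt_sketch(messages,
                           system='<|im_start|>system\n', eosys='<|im_end|>\n',
                           user='<|im_start|>user\n', eoh='<|im_end|>\n',
                           assistant='<|im_start|>assistant\n', eoa='<|im_end|>',
                           separator='\n'):
    """Wrap each message's content in its role's begin/end markers,
    then append a bare assistant marker for the model to continue from."""
    box_map = {
        'system': (system, eosys),
        'user': (user, eoh),
        'assistant': (assistant, eoa + separator),
    }
    parts = []
    for msg in messages:
        begin, end = box_map[msg['role']]
        parts.append(f"{begin}{msg['content']}{end}")
    parts.append(assistant)  # generation starts after this marker
    return ''.join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is 2+2?"},
]
print(messages2prompt_sketch(messages))
```

The trailing assistant marker is what prompts the model to generate the next turn rather than echo the conversation.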
Code Reference
Source Location
- Repository: lmdeploy
- File: lmdeploy/model.py
- Lines: L110-191 (BaseChatTemplate class), L776-793 (get_chat_template)
Signature
```python
class BaseChatTemplate:
    def __init__(self, system='', meta_instruction='', eosys='',
                 user='', eoh='', assistant='', eoa='',
                 separator='', tool='', eotool='',
                 capability='chat', stop_words=None, **kwargs):
        ...

    def messages2prompt(self, messages, sequence_start=True, **kwargs) -> str:
        """Convert OpenAI-format messages to a prompt string."""
        ...

    def get_prompt(self, prompt, sequence_start=True) -> str:
        """Wrap a plain string prompt with the chat template."""
        ...
```
Import
```python
from lmdeploy.model import BaseChatTemplate, get_chat_template
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| messages | str or List[dict] | Yes | Plain string or OpenAI-format message list [{"role": "user", "content": "..."}] |
| sequence_start | bool | No | Whether this is the start of a conversation (default: True) |
Outputs
| Name | Type | Description |
|---|---|---|
| prompt | str | Formatted prompt string with model-specific tokens applied |
Usage Examples
Direct Template Usage
```python
from lmdeploy.model import get_chat_template

# Get the template for InternLM2
template = get_chat_template('internlm2')

# Format OpenAI-style messages
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is 2+2?"},
]
prompt = template.messages2prompt(messages)
print(prompt)
# Output includes <|im_start|>system\n...<|im_end|>\n<|im_start|>user\n...<|im_end|>\n<|im_start|>assistant\n
```