Implementation: LangChain BaseChatModel Subclass
Overview
Concrete chat model implementation for the DeepSeek API, provided by the ChatDeepSeek class which extends BaseChatOpenAI from langchain-openai.
Description
ChatDeepSeek is a reference implementation of a LangChain chat model for an OpenAI-compatible API provider. It extends BaseChatOpenAI and customizes authentication, endpoint configuration, response processing, and tool-calling behavior for the DeepSeek API.
Key implementation details:
- Inheritance: Extends `BaseChatOpenAI` rather than `BaseChatModel` directly, inheriting all OpenAI-protocol handling.
- Authentication: Uses the `DEEPSEEK_API_KEY` environment variable via a `secret_from_env()` factory. The API base URL defaults to `https://api.deepseek.com/v1`.
- Model validation: A `@model_validator(mode="after")` constructs the OpenAI sync and async clients with the provider's credentials and endpoint.
- Response processing: Overrides `_create_chat_result()` and `_convert_chunk_to_generation_chunk()` to extract the DeepSeek-specific `reasoning_content` field from responses.
- Error handling: Wraps `_generate()` and `_stream()` to catch `JSONDecodeError` and provide meaningful error messages.
- Tool calling: Overrides `bind_tools()` and `with_structured_output()` to switch to the DeepSeek beta API endpoint when `strict=True`.
- Model profiles: Loads provider-specific model profiles from static data for capability detection.
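The response-processing override can be illustrated with a stdlib-only sketch: given an OpenAI-style chat-completion message, copy the DeepSeek-specific `reasoning_content` field into the message's `additional_kwargs`. The helper below is hypothetical; the real logic lives inside the `_create_chat_result()` override.

```python
# Hypothetical stdlib-only sketch of the reasoning_content extraction idea;
# the real implementation overrides _create_chat_result() on the class.

def extract_reasoning(choice_message: dict) -> dict:
    """Split a DeepSeek chat-completion message into content plus extras.

    DeepSeek-R1 responses carry chain-of-thought text in a provider-specific
    `reasoning_content` field alongside the usual `content`.
    """
    additional_kwargs = {}
    reasoning = choice_message.get("reasoning_content")
    if reasoning is not None:
        additional_kwargs["reasoning_content"] = reasoning
    return {
        "content": choice_message.get("content", ""),
        "additional_kwargs": additional_kwargs,
    }

msg = extract_reasoning(
    {"role": "assistant", "content": "4", "reasoning_content": "2 + 2 = 4"}
)
```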
Usage
Use ChatDeepSeek as a reference when implementing any new chat model that wraps an OpenAI-compatible API. The pattern of extending BaseChatOpenAI and overriding only the provider-specific behavior applies to many integrations.
Code Reference
Source Location: libs/partners/deepseek/langchain_deepseek/chat_models.py, Lines 46-542
Class Signature:
```python
class ChatDeepSeek(BaseChatOpenAI):
    """DeepSeek chat model integration to access models hosted in DeepSeek's API."""

    model_name: str = Field(alias="model")
    """The name of the model"""

    api_key: SecretStr | None = Field(
        default_factory=secret_from_env("DEEPSEEK_API_KEY", default=None),
    )
    """DeepSeek API key"""

    api_base: str = Field(
        default_factory=from_env("DEEPSEEK_API_BASE", default=DEFAULT_API_BASE),
    )
    """DeepSeek API base URL"""

    @property
    def _llm_type(self) -> str:
        """Return type of chat model."""
        return "chat-deepseek"

    @property
    def lc_secrets(self) -> dict[str, str]:
        """A map of constructor argument names to secret ids."""
        return {"api_key": "DEEPSEEK_API_KEY"}
```
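The `secret_from_env()` default factory used above can be approximated with a stdlib-only equivalent. This is a sketch of the behavior, not the actual `langchain_core` helper (which additionally wraps the value in `SecretStr`): it returns a zero-argument factory so the environment variable is read lazily, at model construction time rather than at import time.

```python
import os
from typing import Callable, Optional


def secret_from_env_sketch(
    var: str, default: Optional[str] = None
) -> Callable[[], Optional[str]]:
    """Return a zero-arg factory that reads a secret from the environment.

    Mirrors the shape of langchain_core's secret_from_env(); the real helper
    also wraps the value in SecretStr so it is masked in repr() and logs.
    """

    def factory() -> Optional[str]:
        return os.environ.get(var, default)

    return factory


os.environ["DEEPSEEK_API_KEY"] = "sk-example"  # set here for demonstration only
get_key = secret_from_env_sketch("DEEPSEEK_API_KEY")
key = get_key()
```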
Import:
```python
from langchain_deepseek import ChatDeepSeek
```
I/O Contract
| Input | Type | Description |
|---|---|---|
| `model` / `model_name` | `str` | DeepSeek model identifier (e.g., `"deepseek-chat"`, `"deepseek-reasoner"`) |
| `api_key` | `SecretStr \| None` | DeepSeek API key; falls back to the `DEEPSEEK_API_KEY` env var |
| `api_base` | `str` | API base URL; defaults to `https://api.deepseek.com/v1` |
| `temperature` | `float` | Sampling temperature (inherited from `BaseChatOpenAI`) |
| `max_tokens` | `int \| None` | Maximum tokens to generate (inherited) |
| `messages` (to `invoke`/`stream`) | `list[BaseMessage]` | Input conversation messages |

| Output | Type | Description |
|---|---|---|
| `invoke()` result | `AIMessage` | Complete model response with content, tool calls, usage metadata |
| `stream()` chunks | `Iterator[AIMessageChunk]` | Streaming token chunks |
| `reasoning_content` | `str` (in `additional_kwargs`) | Chain-of-thought reasoning (DeepSeek-R1 models) |
| `usage_metadata` | `dict` | Token usage: `input_tokens`, `output_tokens`, `total_tokens` |
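The error-handling override noted in the description — catching `JSONDecodeError` inside `_generate()`/`_stream()` and surfacing a meaningful message — can be sketched with a hypothetical stdlib-only wrapper (not the actual implementation):

```python
import json


def parse_with_clear_errors(raw_response: str) -> dict:
    """Parse an API response body, converting a bare JSONDecodeError into a
    message that points at the provider rather than at the JSON parser."""
    try:
        return json.loads(raw_response)
    except json.JSONDecodeError as exc:
        raise ValueError(
            "DeepSeek API returned an invalid (non-JSON) response body; "
            "check the endpoint URL and API key."
        ) from exc


ok = parse_with_clear_errors('{"choices": []}')
```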
Usage Examples
Basic invocation:
```python
from langchain_deepseek import ChatDeepSeek

model = ChatDeepSeek(
    model="deepseek-chat",
    temperature=0,
    max_tokens=100,
)

messages = [
    ("system", "You are a helpful translator. Translate to French."),
    ("human", "I love programming."),
]

response = model.invoke(messages)
print(response.content)
```
Streaming:
```python
for chunk in model.stream(messages):
    print(chunk.text, end="")
```
Tool calling:
```python
from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    """Get the current weather in a given location."""

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


model_with_tools = model.bind_tools([GetWeather])
response = model_with_tools.invoke("What's the weather in Paris?")
print(response.tool_calls)
```
Structured output:
```python
class Joke(BaseModel):
    """Joke to tell user."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")


structured_model = model.with_structured_output(Joke)
result = structured_model.invoke("Tell me a joke about cats")
print(result.setup)
print(result.punchline)
```
Strict mode (beta endpoint):
```python
# Uses https://api.deepseek.com/beta for strict schema validation
structured_model = model.with_structured_output(Joke, strict=True)
```
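The endpoint switch behind strict mode can be sketched as a tiny helper. This is illustrative only (`resolve_api_base` is an invented name; the real switch happens inside the `with_structured_output()` override), using the two URLs given in this document:

```python
# Hypothetical helper illustrating the strict-mode endpoint switch.
DEFAULT_API_BASE = "https://api.deepseek.com/v1"
BETA_API_BASE = "https://api.deepseek.com/beta"


def resolve_api_base(strict: bool, api_base: str = DEFAULT_API_BASE) -> str:
    """Strict structured output requires DeepSeek's beta API endpoint."""
    return BETA_API_BASE if strict else api_base


base = resolve_api_base(strict=True)
```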