Implementation: LangChain BaseChatOpenAI.bind_tools()
| Knowledge Sources | |
|---|---|
| Domains | NLP, Tool_Use |
| Last Updated | 2026-02-11 00:00 GMT |
Overview
Concrete method for binding tool schemas to OpenAI-compatible chat models, provided by the LangChain OpenAI integration.
Description
The BaseChatOpenAI.bind_tools() method accepts a sequence of tool definitions (functions, Pydantic models, BaseTool instances, or raw dicts), converts each to OpenAI's tool schema format via convert_to_openai_tool(), and returns a new Runnable with tools bound as kwargs. The convert_to_openai_tool() utility handles the schema conversion, including optional strict mode for guaranteed schema compliance.
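To make the conversion concrete, here is a minimal, self-contained sketch of the schema shape that convert_to_openai_tool() produces for a plain function. The helper sketch_openai_tool() below is hypothetical (not part of LangChain) and only handles simple annotated functions; the real utility also handles Pydantic models, BaseTool instances, raw dicts, and the strict flag.

```python
import inspect

def get_weather(city: str) -> str:
    """Get weather for a city."""
    return f"Sunny in {city}"

# Map a few Python annotations to JSON Schema types (simplified).
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def sketch_openai_tool(fn) -> dict:
    # Hypothetical stand-in for convert_to_openai_tool(): derive the
    # OpenAI tool schema from the function's signature and docstring.
    params = inspect.signature(fn).parameters
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn) or "",
            "parameters": {
                "type": "object",
                "properties": {
                    name: {"type": TYPE_MAP.get(p.annotation, "string")}
                    for name, p in params.items()
                },
                # Parameters without defaults are required.
                "required": [
                    name for name, p in params.items()
                    if p.default is inspect.Parameter.empty
                ],
            },
        },
    }

schema = sketch_openai_tool(get_weather)
print(schema["function"]["name"])  # get_weather
```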
Usage
Call bind_tools() on any OpenAI-compatible chat model to enable tool calling. For Anthropic models, ChatAnthropic.bind_tools() follows the same interface but converts to Anthropic's tool format.
Code Reference
Source Location
- Repository: langchain
- File: libs/partners/openai/langchain_openai/chat_models/base.py (bind_tools), libs/core/langchain_core/utils/function_calling.py (convert_to_openai_tool)
- Lines: base.py L1867-1959 (bind_tools); function_calling.py L498-575 (convert_to_openai_tool)
Signature
def bind_tools(
self,
tools: Sequence[dict[str, Any] | type | Callable | BaseTool],
*,
tool_choice: dict | str | bool | None = None,
strict: bool | None = None,
parallel_tool_calls: bool | None = None,
response_format: _DictOrPydanticClass | None = None,
**kwargs: Any,
) -> Runnable[LanguageModelInput, AIMessage]:
def convert_to_openai_tool(
tool: Mapping[str, Any] | type[BaseModel] | Callable | BaseTool,
*,
strict: bool | None = None,
) -> dict[str, Any]:
Import
from langchain_openai import ChatOpenAI
# convert_to_openai_tool is used internally; import if needed:
from langchain_core.utils.function_calling import convert_to_openai_tool
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| tools | Sequence[dict or type or Callable or BaseTool] | Yes | Tool definitions to bind |
| tool_choice | dict or str or bool or None | No | Controls tool usage: "auto", "none", "required" (or its alias "any"), a specific tool name, True (force the single bound tool), or a raw provider dict |
| strict | bool or None | No | Enforce OpenAI strict schema mode so tool arguments are guaranteed to match the declared JSON schema |
| parallel_tool_calls | bool or None | No | Whether the model may return multiple tool calls in one response; set False to disable |
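The strict flag corresponds to OpenAI's structured-outputs rules for tool schemas: every property must be listed as required and additionalProperties must be false. The helper apply_strict() below is a hypothetical illustration of that transformation, not LangChain code.

```python
import copy

def apply_strict(tool_schema: dict) -> dict:
    # Hypothetical sketch of what strict=True asks of a tool schema:
    # mark all properties required and forbid extra properties.
    out = copy.deepcopy(tool_schema)
    fn = out["function"]
    params = fn["parameters"]
    params["additionalProperties"] = False
    params["required"] = sorted(params["properties"])
    fn["strict"] = True
    return out

tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": [],
        },
    },
}
strict_tool = apply_strict(tool)
```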
Outputs
| Name | Type | Description |
|---|---|---|
| return | Runnable[LanguageModelInput, AIMessage] | Model with tools bound, ready for invocation |
Usage Examples
Binding Tools
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
@tool
def get_weather(city: str) -> str:
"""Get weather for a city."""
return f"Sunny in {city}"
llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([get_weather])
response = llm_with_tools.invoke("What's the weather in Paris?")
print(response.tool_calls)
# [{'name': 'get_weather', 'args': {'city': 'Paris'}, 'id': 'call_...'}]
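After invoking, each entry in response.tool_calls has the name/args/id shape printed above, so executing the requested tools is a simple dispatch loop. The sketch below uses a hand-written tool_calls list as a stand-in for a real model response; tools_by_name and tool_results are illustrative names, not LangChain APIs.

```python
def get_weather(city: str) -> str:
    """Get weather for a city."""
    return f"Sunny in {city}"

# Look up bound tools by the name the model emits.
tools_by_name = {"get_weather": get_weather}

# Stand-in for response.tool_calls from the invoke() above.
tool_calls = [
    {"name": "get_weather", "args": {"city": "Paris"}, "id": "call_123"},
]

tool_results = []
for call in tool_calls:
    fn = tools_by_name[call["name"]]
    # Pair each result with its tool_call_id so it can be returned
    # to the model as a ToolMessage on the next turn.
    tool_results.append(
        {"tool_call_id": call["id"], "content": fn(**call["args"])}
    )

print(tool_results[0]["content"])  # Sunny in Paris
```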
Forcing Tool Choice
# Force the model to use a specific tool
llm_forced = llm.bind_tools(
[get_weather],
tool_choice="get_weather",
)
# Force the model to use any tool (must use at least one)
llm_any = llm.bind_tools(
[get_weather],
tool_choice="required",
)