Workflow: Mistral AI Python Client Function Calling
| Knowledge Sources | |
|---|---|
| Domains | LLMs, Function_Calling, Tool_Use, Python_SDK |
| Last Updated | 2026-02-15 14:00 GMT |
Overview
End-to-end process for enabling Mistral AI models to invoke external functions (tools) during chat completion, allowing the model to access real-time data or perform actions.
Description
This workflow demonstrates the function calling (tool use) capability of Mistral AI models. The model can decide to call one or more developer-defined functions when a user query requires external data or actions. The developer defines tool schemas (function name, description, parameters), the model generates structured JSON arguments for the appropriate function, the developer executes the function and returns results, and the model incorporates those results into its final response. This enables building AI agents that can look up databases, call APIs, perform calculations, and more.
Usage
Execute this workflow when building applications where the model needs to access external data, perform calculations, interact with APIs, or take real-world actions. Common scenarios include customer support bots that look up account information, assistants that check weather or stock prices, and applications requiring structured data retrieval.
Execution Steps
Step 1: Define Tool Functions and Schemas
Implement the actual Python functions that the model can invoke, and create corresponding tool schema definitions. Each schema includes the function name, a natural language description, and a JSON Schema specification of the expected parameters.
Key considerations:
- Tool definitions use the Tool model with a nested Function containing name, description, and parameters
- Parameters follow JSON Schema format with type, properties, and required fields
- Function descriptions should be clear enough for the model to understand when to use each tool
- Map function names to implementations using a dictionary for easy dispatch
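A minimal sketch of this step, assuming a hypothetical `get_weather` tool with mock data (the SDK also accepts plain dicts in this shape in place of the `Tool`/`Function` models):

```python
import json

# Hypothetical tool implementation: look up a (mock) weather report for a city.
# The function name, parameters, and return format are illustrative, not part
# of the Mistral SDK itself. A real version would call a weather API.
def get_weather(city: str) -> str:
    mock_data = {"Paris": "18C, partly cloudy", "Tokyo": "22C, clear"}
    return json.dumps({"city": city, "report": mock_data.get(city, "unknown")})

# Tool schema: name, natural language description, and JSON Schema parameters.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather report for a given city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. 'Paris'"},
            },
            "required": ["city"],
        },
    },
}

# Name-to-implementation mapping used for dispatch in later steps.
names_to_functions = {"get_weather": get_weather}
```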
Step 2: Send Initial Request with Tool Definitions
Make a chat completion request that includes both the user message and the list of available tool definitions. The model analyzes the user's intent and decides whether to call a tool or respond directly.
Key considerations:
- Pass the tools parameter with the list of Tool objects
- Set temperature to 0 for more deterministic tool selection (sampling noise is reduced, though determinism is not guaranteed)
- The tool_choice parameter controls whether the model must call a tool (any/required), may call a tool (auto), or must not (none)
- Parallel tool calls can be enabled for models that support it
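A sketch of the initial request as a plain payload dict, assuming the hypothetical `get_weather` tool from Step 1; the model name and user question are placeholders, and the actual network call (shown in a comment) requires an API key:

```python
# Tool schema for the hypothetical get_weather tool (see Step 1).
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather report for a given city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Request payload: user message plus the list of available tools.
# tool_choice="auto" lets the model decide whether to call a tool;
# temperature=0 makes tool selection more deterministic.
request = {
    "model": "mistral-large-latest",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
    "tool_choice": "auto",
    "temperature": 0,
}

# With an API key configured, the request itself would be sent as:
#   from mistralai import Mistral
#   client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
#   response = client.chat.complete(**request)
```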
Step 3: Detect and Parse Tool Calls
Check the model's response for tool call requests. When the model decides to use a tool, the response contains tool_calls with the function name and serialized JSON arguments instead of (or in addition to) text content.
Key considerations:
- Tool calls are found in response.choices[0].message.tool_calls
- Each tool call has an id, function.name, and function.arguments (JSON string)
- Parse arguments with json.loads() to get a Python dictionary
- The model may request multiple tool calls in a single response
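The parsing logic can be sketched against a simulated response fragment shaped like the SDK's tool-call objects (real code would iterate over `response.choices[0].message.tool_calls`; the id and arguments here are illustrative):

```python
import json

# Simulated tool calls in the shape the API returns: each has an id, a
# function name, and the arguments as a serialized JSON string.
tool_calls = [
    {
        "id": "call_abc123",  # hypothetical tool call id
        "function": {
            "name": "get_weather",
            "arguments": '{"city": "Paris"}',  # JSON string, not a dict
        },
    }
]

# Parse each call: json.loads turns the argument string into a Python dict.
parsed = []
for call in tool_calls:
    args = json.loads(call["function"]["arguments"])
    parsed.append((call["id"], call["function"]["name"], args))
```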
Step 4: Execute Functions and Collect Results
Look up and invoke the requested function(s) with the parsed arguments. Collect the return values to send back to the model as tool results.
Key considerations:
- Use a name-to-function mapping dictionary for dispatch
- Handle potential errors in function execution gracefully
- Results should be serializable as strings for the ToolMessage content
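A sketch of dispatch with graceful error handling, reusing the hypothetical `get_weather` tool; returning an error payload as a string (rather than raising) lets the model see and report the failure:

```python
import json

# Hypothetical tool implementation with mock data (see Step 1).
def get_weather(city: str) -> str:
    mock_data = {"Paris": "18C, partly cloudy"}
    return json.dumps({"city": city, "report": mock_data.get(city, "unknown")})

names_to_functions = {"get_weather": get_weather}

def execute_tool_call(name: str, args: dict) -> str:
    # Dispatch by name via the mapping dictionary; always return a string
    # so the result can be placed directly into a ToolMessage's content.
    func = names_to_functions.get(name)
    if func is None:
        return json.dumps({"error": f"unknown tool: {name}"})
    try:
        return func(**args)
    except Exception as exc:
        return json.dumps({"error": str(exc)})

result = execute_tool_call("get_weather", {"city": "Paris"})
```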
Step 5: Return Results and Get Final Response
Append the assistant's message (including the tool_calls) and a ToolMessage containing each function's result to the conversation history. Send a follow-up request so the model can incorporate the tool results into a natural language response for the user.
Key considerations:
- The AssistantMessage must include the original tool_calls for context
- Each ToolMessage must reference the corresponding tool_call_id
- The model uses the tool results to formulate its final answer
- Multiple rounds of tool calling may occur in complex scenarios
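The conversation state for the follow-up request can be sketched with plain dicts in the shapes the chat API expects (the SDK's `AssistantMessage`/`ToolMessage` models are equivalent); the ids and contents are illustrative, and the final call shown in the comment requires an API key:

```python
# Conversation history for the follow-up request. The assistant message keeps
# its original tool_calls, and the tool message's tool_call_id must match the
# id of the call it answers.
messages = [
    {"role": "user", "content": "What's the weather in Paris?"},
    {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {
                "id": "call_abc123",  # hypothetical id from the first response
                "function": {
                    "name": "get_weather",
                    "arguments": '{"city": "Paris"}',
                },
            }
        ],
    },
    {
        "role": "tool",
        "name": "get_weather",
        "content": '{"city": "Paris", "report": "18C, partly cloudy"}',
        "tool_call_id": "call_abc123",  # references the assistant's tool call
    },
]

# The final request (requires a configured client) would be:
#   response = client.chat.complete(model="mistral-large-latest", messages=messages)
# The model then folds the tool result into a natural language answer.
```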