Implementation: Microsoft Semantic Kernel InvokePromptAsync With Functions
| Knowledge Sources | |
|---|---|
| Domains | AI_Orchestration, Function_Calling, Prompt_Engineering |
| Last Updated | 2026-02-11 19:00 GMT |
Overview
Concrete tool for invoking a natural language prompt with automatic function calling enabled, allowing the AI model to autonomously select and execute registered plugin functions in Microsoft Semantic Kernel.
Description
Kernel.InvokePromptAsync() with FunctionChoiceBehavior configured in the KernelArguments combines prompt execution with AI-driven tool use. The same InvokePromptAsync API used for simple prompt invocation (see Workflow 1) gains function calling capabilities when the execution settings include a FunctionChoiceBehavior such as FunctionChoiceBehavior.Auto().
When function calling is enabled, the method:
- Creates a KernelFunction from the prompt template string.
- Invokes the function via Kernel.InvokeAsync, which sends the prompt along with JSON schemas for all registered plugin functions to the AI model.
- If the model responds with function call requests, the AI connector automatically executes the requested functions (when autoInvoke is true) and appends the results to the conversation.
- The connector sends the updated conversation back to the model and repeats until the model produces a final text response.
- Returns the final response as a FunctionResult.
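The automatic loop described above can also be switched off so the caller handles function-call requests manually. As a sketch (the `autoInvoke` parameter on `FunctionChoiceBehavior.Auto` exists in current Semantic Kernel releases, but check your package version):

```csharp
// Auto-invoke (default): the connector executes requested functions
// and loops until the model produces a final text response.
var autoSettings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Manual mode: the model may still request function calls, but the
// connector returns them to the caller instead of executing them.
var manualSettings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(autoInvoke: false)
};
```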
The execution settings are passed through KernelArguments, which wraps PromptExecutionSettings (or a provider-specific subclass like OpenAIPromptExecutionSettings). The FunctionChoiceBehavior property on the settings controls the tool-use policy.
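A sketch of how the wrapping looks in practice; `KernelArguments` accepts the settings in its constructor and template variables via its indexer (the `"city"` variable is illustrative):

```csharp
var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Settings only
var args = new KernelArguments(settings);

// Settings plus template variables referenced by the prompt template
var argsWithVars = new KernelArguments(settings)
{
    ["city"] = "Seattle"
};
```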
This is the same API signature as basic prompt invocation, making it easy to upgrade from simple prompts to function-calling-enabled prompts by simply adding execution settings to the arguments.
Usage
Use InvokePromptAsync with FunctionChoiceBehavior whenever you want the AI model to have access to registered tools while responding to a prompt. This is the primary mechanism for building tool-augmented conversational AI.
Code Reference
Source Location
- Repository: semantic-kernel
- File: dotnet/src/SemanticKernel.Core/KernelExtensions.cs:L1238-1259
- Sample: dotnet/samples/GettingStarted/Step2_Add_Plugins.cs:L40-42
Signature
```csharp
public static Task<FunctionResult> InvokePromptAsync(
    this Kernel kernel,
    string promptTemplate,
    KernelArguments? arguments = null,
    string? templateFormat = null,
    IPromptTemplateFactory? promptTemplateFactory = null,
    PromptTemplateConfig? promptTemplateConfig = null,
    CancellationToken cancellationToken = default)
```
Import
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI; // for OpenAIPromptExecutionSettings
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| kernel | Kernel | Yes | The kernel containing registered AI services and plugins (provided as the extension method receiver). |
| promptTemplate | string | Yes | The natural language prompt template to send to the AI model. |
| arguments | KernelArguments? | No | Arguments, including PromptExecutionSettings with FunctionChoiceBehavior configured. When FunctionChoiceBehavior is set, function calling is enabled. |
| templateFormat | string? | No | The format of the prompt template (e.g., "semantic-kernel", "handlebars"). Defaults to the built-in format. |
| promptTemplateFactory | IPromptTemplateFactory? | No | Custom factory for interpreting the prompt template. |
| promptTemplateConfig | PromptTemplateConfig? | No | Additional prompt template configuration. |
| cancellationToken | CancellationToken | No | Token for cancelling the async operation. |
Outputs
| Name | Type | Description |
|---|---|---|
| return | Task&lt;FunctionResult&gt; | The final result from the AI model after all function-call rounds have completed. Contains the model's text response. |
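For illustration, the returned FunctionResult can be consumed either as plain text via ToString() or through its typed accessor; a minimal sketch (the prompt and settings are placeholders):

```csharp
FunctionResult result = await kernel.InvokePromptAsync(
    "How many days until Christmas?", new(settings));

// ToString() returns the model's final text response
Console.WriteLine(result.ToString());

// GetValue<T>() exposes the underlying value, here the text content
string? text = result.GetValue<string>();
```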
Usage Examples
Basic Prompt with Auto Function Calling
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Build kernel with time plugin
IKernelBuilder kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddOpenAIChatClient(modelId: "gpt-4o", apiKey: "your-api-key");
kernelBuilder.Plugins.AddFromType<TimeInformation>();
Kernel kernel = kernelBuilder.Build();

// Invoke prompt with function calling enabled
OpenAIPromptExecutionSettings settings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// The model will call TimeInformation.GetCurrentUtcTime to answer this question
Console.WriteLine(await kernel.InvokePromptAsync(
    "How many days until Christmas? Explain your thinking.",
    new(settings)));
```
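The TimeInformation plugin is assumed above but not shown. A minimal sketch of how such a plugin might look, following the pattern in the SK getting-started samples (the class and method names are illustrative):

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Hypothetical plugin: each [KernelFunction] method is exposed to the
// model as a callable tool, with a JSON schema derived from its signature
// and [Description] attributes.
public class TimeInformation
{
    [KernelFunction]
    [Description("Retrieves the current time in UTC.")]
    public string GetCurrentUtcTime() => DateTime.UtcNow.ToString("R");
}
```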
Multiple Plugins with Complex Function Calling
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Build kernel with multiple plugins
IKernelBuilder kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddOpenAIChatClient(modelId: "gpt-4o", apiKey: "your-api-key");
kernelBuilder.Plugins.AddFromType<TimeInformation>();
kernelBuilder.Plugins.AddFromType<WidgetFactory>();
Kernel kernel = kernelBuilder.Build();

OpenAIPromptExecutionSettings settings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// The model decides which plugin functions to call based on the prompt
Console.WriteLine(await kernel.InvokePromptAsync(
    "Create a handy lime colored widget for me.", new(settings)));
Console.WriteLine(await kernel.InvokePromptAsync(
    "Create an attractive maroon and navy colored widget for me.", new(settings)));
```
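WidgetFactory is likewise assumed rather than shown. A hedged sketch of how enum parameters could let the model choose widget type and colors (all names are illustrative, modeled on the SK sample's pattern):

```csharp
using System.ComponentModel;
using System.Text.Json.Serialization;
using Microsoft.SemanticKernel;

// Hypothetical plugin: enum parameters appear in the generated JSON
// schema as constrained choices the model can select from.
public class WidgetFactory
{
    [KernelFunction]
    [Description("Creates a new widget of the specified type and colors.")]
    public string CreateWidget(
        [Description("The type of widget to create.")] WidgetType type,
        [Description("The colors of the widget.")] WidgetColor[] colors)
        => $"Created a {type} widget in {string.Join(", ", colors)}.";
}

[JsonConverter(typeof(JsonStringEnumConverter))]
public enum WidgetType { Useful, Decorative }

[JsonConverter(typeof(JsonStringEnumConverter))]
public enum WidgetColor { Red, Green, Blue }
```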
Contrasting Template-Based vs Function-Calling Approaches
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Approach 1: Template-based (deterministic, function called at render time)
Console.WriteLine(await kernel.InvokePromptAsync(
    "The current time is {{TimeInformation.GetCurrentUtcTime}}. How many days until Christmas?"));

// Approach 2: Function calling (AI-driven, model decides to call the function)
OpenAIPromptExecutionSettings settings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
Console.WriteLine(await kernel.InvokePromptAsync(
    "How many days until Christmas?", new(settings)));
```