Implementation: Microsoft Semantic Kernel InvokePromptAsync With KernelArguments
| Knowledge Sources | |
|---|---|
| Domains | AI_Orchestration, Template_Rendering |
| Last Updated | 2026-02-11 19:00 GMT |
Overview
Concrete tool for invoking a templated prompt with variable substitution through KernelArguments provided by the Microsoft Semantic Kernel library.
Description
Kernel.InvokePromptAsync with KernelArguments extends the basic prompt invocation by supporting variable substitution within the prompt template. Template variables use the `{{$variableName}}` syntax, and their values are supplied through a KernelArguments dictionary. Before the prompt is sent to the AI service, the template engine resolves each variable by looking up its key in the arguments, producing a fully rendered prompt string.
The KernelArguments type is a dictionary-like container that maps string keys to object values. It supports initialization through collection initializer syntax, making it concise to construct. In addition to template variables, KernelArguments can also carry PromptExecutionSettings that control the AI service's generation behavior (temperature, max tokens, etc.), making it a unified carrier for both template data and service configuration.
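The dictionary-like behavior described above can be sketched as follows. This is a minimal illustration, assuming KernelArguments' standard IDictionary-style surface (collection initializer, indexer, ContainsName); the variable names are illustrative only.

```csharp
using Microsoft.SemanticKernel;

// Construct with collection initializer syntax, as described above.
KernelArguments arguments = new() { { "topic", "sea" } };

// Entries can also be added or read through the indexer.
arguments["style"] = "haiku";
Console.WriteLine(arguments["topic"]);              // the value stored for "topic"
Console.WriteLine(arguments.ContainsName("style")); // whether a "style" key exists

// When constructed with PromptExecutionSettings (see the example further
// below), those settings travel alongside the template variables via the
// ExecutionSettings property.
```

Because KernelArguments is a plain key/value container, the same instance can be reused or mutated between invocations.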
Usage
Use this overload whenever your prompt contains `{{$variable}}` placeholders that need to be filled with dynamic values. This is the standard approach for parameterized prompts in Semantic Kernel and should be preferred over manual string concatenation.
Code Reference
Source Location
- Repository: semantic-kernel
- File: dotnet/src/SemanticKernel.Core/KernelExtensions.cs:L1282-1297
Signature
```csharp
public static Task<FunctionResult> InvokePromptAsync(
    this Kernel kernel,
    string promptTemplate,
    KernelArguments? arguments = null,
    string? templateFormat = null,
    IPromptTemplateFactory? promptTemplateFactory = null,
    CancellationToken cancellationToken = default)
```
Import
using Microsoft.SemanticKernel;
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| kernel | Kernel | Yes | The kernel instance (implicit via extension method). |
| promptTemplate | string | Yes | The prompt template containing `{{$variable}}` placeholders to be resolved against the arguments. |
| arguments | KernelArguments? | No (required when the template contains variables) | A dictionary of variable names to values. Each key corresponds to a `{{$key}}` placeholder in the template. May also carry PromptExecutionSettings. |
| templateFormat | string? | No | Optional template format identifier. Defaults to the Semantic Kernel format. |
| promptTemplateFactory | IPromptTemplateFactory? | No | Optional factory for the template renderer. |
| cancellationToken | CancellationToken | No | Optional cancellation token. |
Outputs
| Name | Type | Description |
|---|---|---|
| return | Task\<FunctionResult\> | An asynchronous task resolving to the AI response wrapped in a FunctionResult. The rendered prompt (with all variables substituted) is what is sent to the AI service. |
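The returned FunctionResult can be consumed beyond a plain ToString(). The sketch below assumes a configured `kernel` instance (as built in the examples that follow) and uses FunctionResult's standard GetValue\<T\> accessor.

```csharp
using Microsoft.SemanticKernel;

// Invoke the templated prompt and keep the FunctionResult rather than
// printing it directly.
FunctionResult result = await kernel.InvokePromptAsync(
    "What color is the {{$topic}}?",
    new KernelArguments { { "topic", "sea" } });

// Extract the response text in the desired type.
string? text = result.GetValue<string>();
Console.WriteLine(text);
```

Calling Console.WriteLine on the result directly, as the examples below do, implicitly uses FunctionResult's string conversion of the same response.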
Usage Examples
Single Variable Substitution
```csharp
using Microsoft.SemanticKernel;

Kernel kernel = Kernel.CreateBuilder()
    .AddOpenAIChatClient(
        modelId: TestConfiguration.OpenAI.ChatModelId,
        apiKey: TestConfiguration.OpenAI.ApiKey)
    .Build();

// Create arguments with a single variable
KernelArguments arguments = new() { { "topic", "sea" } };

// The template {{$topic}} is replaced with "sea" before sending to the AI
Console.WriteLine(await kernel.InvokePromptAsync("What color is the {{$topic}}?", arguments));
```
Multiple Variable Substitution
```csharp
using Microsoft.SemanticKernel;

KernelArguments arguments = new()
{
    { "animal", "cat" },
    { "activity", "sleeping" },
    { "location", "sunny windowsill" }
};

Console.WriteLine(await kernel.InvokePromptAsync(
    "Write a short poem about a {{$animal}} {{$activity}} on a {{$location}}.",
    arguments));
```
Arguments with Execution Settings
```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// KernelArguments can carry both template variables and execution settings
KernelArguments arguments = new(new OpenAIPromptExecutionSettings
{
    MaxTokens = 500,
    Temperature = 0.5
})
{
    { "topic", "dogs" }
};

Console.WriteLine(await kernel.InvokePromptAsync(
    "Tell me a story about {{$topic}}",
    arguments));
```
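Cancellation
The signature's final parameter accepts a CancellationToken. A minimal sketch, assuming a configured `kernel` as above; the 5-second timeout is an illustrative value, not a library default.

```csharp
using Microsoft.SemanticKernel;

// Cancel the invocation automatically if it takes longer than 5 seconds.
using CancellationTokenSource cts = new(TimeSpan.FromSeconds(5));

try
{
    Console.WriteLine(await kernel.InvokePromptAsync(
        "Summarize {{$topic}} in one sentence.",
        new KernelArguments { { "topic", "the water cycle" } },
        cancellationToken: cts.Token));
}
catch (OperationCanceledException)
{
    Console.WriteLine("The prompt invocation timed out.");
}
```

Passing the token by name skips the optional templateFormat and promptTemplateFactory parameters that precede it in the signature.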