Principle:OpenAI Node Chat Completion Invocation

From Leeroopedia
Knowledge Sources
Domains NLP, API_Design
Last Updated 2026-02-15 00:00 GMT

Overview

A principle governing the invocation of language model inference via HTTP API calls that accept a structured request and return generated text.

Description

Chat Completion Invocation is the core operation of sending a request to a language model endpoint and receiving a generated response. The invocation handles the HTTP lifecycle: constructing the request, sending it via POST to the /chat/completions endpoint, and parsing the response. The SDK wraps this in an APIPromise for lazy execution, meaning the HTTP call is only made when the promise is awaited.
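As a rough illustration of the HTTP lifecycle described above, the sketch below assembles the raw request that would be sent to the endpoint. The path, method, and header names match the public OpenAI REST API, but `buildChatRequest` and its types are hypothetical helpers for illustration, not part of the SDK:

```typescript
// Simplified request parameters; the real API accepts many more fields.
interface ChatParams {
  model: string;
  messages: { role: string; content: string }[];
  stream?: boolean;
}

// Hypothetical helper: construct the POST request for /chat/completions.
function buildChatRequest(apiKey: string, params: ChatParams) {
  return {
    method: "POST" as const,
    url: "https://api.openai.com/v1/chat/completions",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // bearer token from the client
    },
    body: JSON.stringify(params), // JSON-serialized params
  };
}

const req = buildChatRequest("sk-...", {
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello" }],
});
```

Sending `req` with any HTTP client and parsing the JSON body of the response completes the cycle; the SDK performs these same steps internally.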

The invocation supports two modes: non-streaming (returns a complete ChatCompletion object) and streaming (returns a Stream<ChatCompletionChunk> for incremental delivery). The mode is determined by the stream parameter in the request body.
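The two response shapes can be sketched with simplified stand-in types. In streaming mode the server delivers Server-Sent Events, each carrying one JSON-encoded chunk and terminated by a `[DONE]` sentinel; `parseSSE` below is an illustrative parser, not the SDK's actual implementation:

```typescript
// Simplified stand-ins for ChatCompletion / ChatCompletionChunk.
type Completion = { choices: { message: { content: string } }[] };
type Chunk = { choices: { delta: { content?: string } }[] };

// Streaming responses arrive as Server-Sent Events: each event is a
// "data: <json>" line, and the stream ends with "data: [DONE]".
function parseSSE(raw: string): Chunk[] {
  return raw
    .split("\n\n")
    .map((event) => event.replace(/^data: /, "").trim())
    .filter((data) => data !== "" && data !== "[DONE]")
    .map((data) => JSON.parse(data) as Chunk);
}

const chunks = parseSSE(
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n\n' +
  'data: [DONE]\n\n'
);
// Concatenating the deltas reconstructs the full message text.
const text = chunks.map((c) => c.choices[0].delta.content ?? "").join("");
// text === "Hello"
```

In non-streaming mode there is no event framing: the body is a single JSON document that parses directly into a `Completion`.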

Usage

Use this principle whenever you need to generate text from an OpenAI model. This is the fundamental building block for chatbots, content generation, code generation, and any other text-generation task.

Theoretical Basis

The invocation follows a Request-Response pattern with discriminated return types:

function invoke(params):
    // HTTP: POST /chat/completions
    // Body: JSON-serialized params
    // Auth: Bearer token from client
    if params.stream:
        return Stream<Chunk>           // Server-Sent Events
    else:
        return APIPromise<Completion>  // Single JSON response

The APIPromise wrapper provides lazy execution — the HTTP request is deferred until .then() or await is called, allowing the caller to attach .withResponse() for header access.
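The deferred-execution behavior can be sketched with a minimal thenable. This `LazyPromise` is an illustrative stand-in for the SDK's `APIPromise`, not its actual implementation:

```typescript
// Illustrative stand-in for APIPromise: the executor runs only once the
// promise is consumed (i.e. when .then() is first called or it is awaited).
class LazyPromise<T> implements PromiseLike<T> {
  private promise?: Promise<T>;

  constructor(private executor: () => Promise<T>) {}

  then<R1 = T, R2 = never>(
    onfulfilled?: ((value: T) => R1 | PromiseLike<R1>) | null,
    onrejected?: ((reason: unknown) => R2 | PromiseLike<R2>) | null
  ): Promise<R1 | R2> {
    this.promise ??= this.executor(); // the HTTP call happens here, not earlier
    return this.promise.then(onfulfilled, onrejected);
  }
}

let called = false;
const lazy = new LazyPromise(async () => {
  called = true; // stands in for sending the actual HTTP request
  return "response";
});
// At this point `called` is still false: constructing the promise
// has not triggered the request.
```

Only awaiting `lazy` (or calling `.then()`) runs the executor, which mirrors why the caller can still attach options such as header access before any network traffic occurs.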

Related Pages

Implemented By
