Principle:OpenAI Node Client-Side Stream Consumption

From Leeroopedia
Knowledge Sources
Domains Streaming, Frontend
Last Updated 2026-02-15 00:00 GMT

Overview

A principle for consuming proxied OpenAI streaming responses on the client side by reconstructing a typed ChatCompletionStream from a fetch response body.

Description

Client-Side Stream Consumption is the browser counterpart to server-side stream proxying. The browser fetches from the server proxy endpoint, gets a ReadableStream response body, and reconstructs a full ChatCompletionStream from it. This reconstruction provides the same typed event interface (content.delta, content.done, chunk) as the server-side stream, enabling consistent consumption patterns.

The browser-side stream supports async iteration, event listeners, and content accumulation — all without requiring an API key on the client.
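The typed-event consumption pattern can be illustrated with a self-contained sketch. `MiniStream` below is a hypothetical stand-in for the real `ChatCompletionStream` class from the `openai` package; only the event names (`content.delta`, `content.done`) and the `on(...)` surface mirror the SDK, everything else is simplified for illustration:

```typescript
// Minimal stand-in for the typed event surface of ChatCompletionStream.
// The real class ships with the openai npm package; this sketch only
// shows the consumption pattern, not the SDK internals.
type Payload = { delta?: string; content?: string };
type Handler = (payload: Payload) => void;

class MiniStream {
  private handlers = new Map<string, Handler[]>();

  on(event: string, handler: Handler): this {
    const list = this.handlers.get(event) ?? [];
    list.push(handler);
    this.handlers.set(event, list);
    return this;
  }

  emit(event: string, payload: Payload): void {
    for (const h of this.handlers.get(event) ?? []) h(payload);
  }
}

// Consume as with the real stream: accumulate deltas as they arrive,
// then take the final accumulated content when the stream completes.
const stream = new MiniStream();
let shown = '';
stream.on('content.delta', ({ delta }) => { shown += delta ?? ''; });
stream.on('content.done', ({ content }) => { shown = content ?? shown; });

// Simulate chunks arriving from the proxy endpoint.
stream.emit('content.delta', { delta: 'Hello' });
stream.emit('content.delta', { delta: ', world' });
stream.emit('content.done', { content: 'Hello, world' });
```

No API key appears anywhere in this flow: the client only handles already-proxied chunks, which is the point of the pattern.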

Usage

Use this principle in browser applications that consume streaming responses from a backend proxy. The fromReadableStream factory method handles deserialization and event dispatch.

Theoretical Basis

Client-side consumption follows a Reconstruction Pattern:

// Browser-side:
import { ChatCompletionStream } from 'openai/lib/ChatCompletionStream';

// 1. Fetch from the server proxy
const response = await fetch('/api/chat', { method: 'POST', body: ... });

// 2. Reconstruct a ChatCompletionStream from the response body
const stream = ChatCompletionStream.fromReadableStream(response.body);

// 3. Consume with the same typed interface as on the server
stream.on('content.delta', ({ delta }) => display(delta));

// Internally, the ReadableStream is parsed:
// - Read UTF-8 bytes
// - Split on newlines
// - JSON.parse each line back into a ChatCompletionChunk
// - Dispatch typed events
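The four parsing steps above can be sketched as a small newline-delimited JSON decoder. This is an illustrative stand-in for what `fromReadableStream` does internally, not the SDK's actual implementation; it processes an array of byte chunks (rather than a live `ReadableStream` reader) to keep the sketch deterministic, and the chunk shapes are simplified stand-ins for `ChatCompletionChunk`:

```typescript
// Sketch of the parse loop: decode UTF-8 bytes, split on newlines,
// JSON.parse each complete line into a chunk object.
function parseChunkLines(byteChunks: Uint8Array[]): unknown[] {
  const decoder = new TextDecoder('utf-8');
  const parsed: unknown[] = [];
  let carry = ''; // holds a partial line that spans two byte chunks

  for (const bytes of byteChunks) {
    // stream: true keeps multi-byte UTF-8 sequences intact across reads
    carry += decoder.decode(bytes, { stream: true });
    const lines = carry.split('\n');
    carry = lines.pop() ?? ''; // last piece may be an incomplete line
    for (const line of lines) {
      if (line.trim() !== '') parsed.push(JSON.parse(line));
    }
  }
  if (carry.trim() !== '') parsed.push(JSON.parse(carry));
  return parsed;
}

// Simulate the proxy splitting a serialized chunk across two reads.
const encoder = new TextEncoder();
const wire =
  '{"choices":[{"delta":{"content":"Hel"}}]}\n' +
  '{"choices":[{"delta":{"content":"lo"}}]}\n';
const chunks = parseChunkLines([
  encoder.encode(wire.slice(0, 25)), // cut mid-line on purpose
  encoder.encode(wire.slice(25)),
]);
```

Note the carry buffer: because network reads do not respect line boundaries, a serialized chunk can arrive split across two reads, so any incomplete trailing line must be held back until the next read completes it.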

Related Pages

Implemented By
