
Implementation: openai-node fromReadableStream

From Leeroopedia
Knowledge Sources
Domains Streaming, Frontend
Last Updated 2026-02-15 00:00 GMT

Overview

A concrete tool from the openai-node SDK for reconstructing a ChatCompletionStream from a browser fetch() response body.

Description

The ChatCompletionStream.fromReadableStream() static factory method takes a ReadableStream (from a browser fetch() response body) and reconstructs a fully functional ChatCompletionStream. It internally creates a Stream<ChatCompletionChunk> that parses JSON lines back into typed objects, then feeds them through the same event dispatch pipeline as a server-side stream.

The resulting stream supports all the same events (content.delta, content.done, chunk), async iteration, and finalChatCompletion().
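The round trip can be pictured as newline-delimited JSON: each line of the body is one serialised ChatCompletionChunk. The sketch below is illustrative only (the SDK's actual parsing lives in its streaming core, not in this helper); it assumes one JSON-serialised chunk per line, as described above:

```typescript
// Illustrative sketch (not the SDK's actual code) of the JSON-lines
// parse that fromReadableStream() performs internally: buffer bytes,
// split on newlines, and JSON.parse each complete line.
async function* parseJsonLines(
  stream: ReadableStream<Uint8Array>,
): AsyncGenerator<unknown> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let newline: number;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) yield JSON.parse(line);
    }
  }
  // Flush a trailing line that was not newline-terminated.
  if (buffer.trim()) yield JSON.parse(buffer.trim());
}
```

The real implementation additionally wires each parsed chunk into the typed event dispatch pipeline, which this sketch omits.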

Usage

Call ChatCompletionStream.fromReadableStream(response.body) in browser code after fetching from a server proxy endpoint that uses toReadableStream().
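On the server side, the proxy can be a thin handler that forwards the stream's toReadableStream() output. A minimal framework-agnostic sketch, assuming an already-constructed openai-node client (the .stream() helper has lived under beta.chat.completions in older SDK versions); the route shape, model name, and Content-Type header are illustrative assumptions, not SDK requirements:

```typescript
// Hypothetical server-side proxy handler. `client` is an openai-node
// OpenAI instance, typed loosely so the sketch stays framework- and
// version-agnostic.
export function makeChatProxy(client: any) {
  return async (req: Request): Promise<Response> => {
    const { messages } = await req.json();
    // .stream() returns a ChatCompletionStream on the server...
    const stream = client.chat.completions.stream({
      model: 'gpt-4o-mini', // illustrative model choice
      messages,
    });
    // ...and toReadableStream() serialises each chunk as a JSON line
    // that fromReadableStream() parses back in the browser.
    return new Response(stream.toReadableStream(), {
      headers: { 'Content-Type': 'application/x-ndjson' },
    });
  };
}
```

In a fetch-style framework (Next.js route handlers, Hono, Cloudflare Workers) this Request-to-Response shape can be used directly; with Express you would pipe the readable stream into the response object instead of returning a Response.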

Code Reference

Source Location

  • Repository: openai-node
  • File: src/lib/ChatCompletionStream.ts (fromReadableStream), src/core/streaming.ts (Stream.fromReadableStream base)
  • Lines: ChatCompletionStream.ts:L154-158, ChatCompletionStream.ts:L397-423 (_fromReadableStream), streaming.ts:L107-152

Signature

class ChatCompletionStream<ParsedT> {
  static fromReadableStream(
    stream: ReadableStream,
  ): ChatCompletionStream<null>;
}

Import

import { ChatCompletionStream } from 'openai/lib/ChatCompletionStream';

I/O Contract

Inputs

  • stream (ReadableStream, required): readable stream taken from a fetch response.body

Outputs

  • chatCompletionStream (ChatCompletionStream<null>): reconstructed stream with typed events, async iteration, and finalChatCompletion()

Usage Examples

Browser-Side Consumption

import { ChatCompletionStream } from 'openai/lib/ChatCompletionStream';

// Fetch from server proxy
const response = await fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});

// Reconstruct stream
const stream = ChatCompletionStream.fromReadableStream(response.body!);

// Use the same event interface as on the server
stream.on('content.delta', ({ delta }) => {
  document.getElementById('output')!.textContent += delta;
});

stream.on('content.done', () => {
  console.log('Stream complete');
});

// Alternatively, consume via async iteration. Use either the event
// handlers above or this loop, not both, or each delta is appended
// to the output twice.
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    document.getElementById('output')!.textContent += content;
  }
}

Related Pages

Implements Principle

Requires Environment
