Implementation: openai-node toReadableStream
| Knowledge Sources | |
|---|---|
| Domains | Streaming, Server_Architecture |
| Last Updated | 2026-02-15 00:00 GMT |
Overview
A method provided by the openai-node SDK for converting a ChatCompletionStream into a Web ReadableStream suitable for HTTP proxying.
Description
The toReadableStream() method on ChatCompletionStream converts the async iterable stream into a Web Streams API ReadableStream. Each chunk from the OpenAI stream is serialized to JSON, appended with a newline, and encoded as UTF-8 bytes. This ReadableStream can be used directly as an HTTP response body.
The base implementation lives in Stream.toReadableStream() (src/core/streaming.ts), with ChatCompletionStream delegating via its EventStream base class.
Usage
Call .toReadableStream() on a ChatCompletionStream instance to get a ReadableStream suitable for use as an HTTP response body. Because the body is newline-delimited JSON rather than SSE frames, set a matching Content-Type response header (e.g. application/x-ndjson or application/octet-stream). On the consuming side, the SDK's ChatCompletionStream.fromReadableStream() can reconstruct a stream from such a body.
Code Reference
Source Location
- Repository: openai-node
- File: src/lib/ChatCompletionStream.ts (ChatCompletionStream.toReadableStream), src/core/streaming.ts (Stream.toReadableStream base)
- Lines: ChatCompletionStream.ts:L603-606, streaming.ts:L191-215
Signature
// On ChatCompletionStream
class ChatCompletionStream<ParsedT> {
  toReadableStream(): ReadableStream;
}

// Base implementation on Stream
class Stream<Item> {
  toReadableStream(): ReadableStream {
    // Creates a ReadableStream where each chunk is:
    // new TextEncoder().encode(JSON.stringify(item) + '\n')
  }
}
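The encoding scheme described above can be sketched as a generic helper over any async iterable. This is an illustration of the behavior, not the SDK's internal code; `toJsonLineStream` is a made-up name.

```typescript
// Sketch of the encoding performed by Stream.toReadableStream():
// each item is JSON-serialized, newline-terminated, and UTF-8 encoded.
// `toJsonLineStream` is illustrative, not part of the openai-node SDK.
function toJsonLineStream<T>(iterable: AsyncIterable<T>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  const iterator = iterable[Symbol.asyncIterator]();
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(encoder.encode(JSON.stringify(value) + '\n'));
      }
    },
  });
}
```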
Import
import OpenAI from 'openai';
// Access via: stream.toReadableStream()
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| (method on) | ChatCompletionStream | Yes | Active stream from client.chat.completions.stream() |
Outputs
| Name | Type | Description |
|---|---|---|
| readableStream | ReadableStream | Web Streams ReadableStream of UTF-8 encoded JSON lines |
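Because the output is UTF-8 bytes where each line is one JSON-encoded chunk, a consumer can recover the chunk objects by buffering and splitting on newlines. A minimal sketch (the helper name is illustrative; note that a JSON line may be split across byte chunks, so a partial trailing line must be carried over):

```typescript
// Sketch of consuming the NDJSON output of toReadableStream().
// `readJsonLines` is an illustrative name, not an SDK export.
async function* readJsonLines(stream: ReadableStream<Uint8Array>): AsyncGenerator<unknown> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // keep any partial trailing line for the next read
    for (const line of lines) {
      if (line.trim()) yield JSON.parse(line);
    }
  }
  if (buffer.trim()) yield JSON.parse(buffer);
}
```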
Usage Examples
Express Server Proxy
import OpenAI from 'openai';
import express from 'express';

const app = express();
app.use(express.json()); // needed so req.body.messages is populated

const client = new OpenAI();

app.post('/api/chat', async (req, res) => {
  const stream = client.chat.completions.stream({
    model: 'gpt-4o',
    messages: req.body.messages,
  });
  // The body is newline-delimited JSON, not SSE framing
  res.setHeader('Content-Type', 'application/x-ndjson');
  const readableStream = stream.toReadableStream();
  // Web ReadableStream is async-iterable in Node 18+
  for await (const chunk of readableStream) {
    res.write(chunk);
  }
  res.end();
});
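As an alternative to manual iteration, Node 17+ can bridge the Web stream to a Node.js Readable via stream.Readable.fromWeb and pipe it straight into the response. A sketch under that assumption (`webToNode` is an illustrative name; the cast is needed because the DOM and node:stream/web ReadableStream types differ):

```typescript
import { Readable } from 'node:stream';
import type { ReadableStream as NodeWebReadableStream } from 'node:stream/web';

// Bridge a Web ReadableStream (as returned by toReadableStream()) to a
// Node.js Readable, which can be piped into an Express/http response.
// Sketch assuming Node >= 17; `webToNode` is not an SDK helper.
function webToNode(web: ReadableStream<Uint8Array>): Readable {
  return Readable.fromWeb(web as unknown as NodeWebReadableStream<Uint8Array>);
}
```

In the Express handler above, the manual loop could then become `webToNode(stream.toReadableStream()).pipe(res);`.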
Next.js Route Handler
import OpenAI from 'openai';

const client = new OpenAI();

export async function POST(req: Request) {
  const { messages } = await req.json();
  const stream = client.chat.completions.stream({
    model: 'gpt-4o',
    messages,
  });
  return new Response(stream.toReadableStream());
}