
Implementation: LangChain BaseChatOpenAI _stream Implementation

From Leeroopedia
Knowledge Sources
Domains Streaming, API_Integration
Last Updated 2026-02-11 00:00 GMT

Overview

A concrete implementation, provided by the LangChain OpenAI integration, that processes OpenAI streaming responses into LangChain chunks.

Description

The BaseChatOpenAI._stream() method calls the OpenAI SDK with stream=True, iterates over the streaming response, and converts each OpenAI chunk into a ChatGenerationChunk containing an AIMessageChunk. It handles both content deltas and tool call chunks, and optionally includes usage metadata (when stream_usage=True).
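The conversion loop can be sketched roughly as follows. This is a minimal, library-free sketch using plain dicts as stand-ins for the openai SDK chunk objects and for AIMessageChunk; the real method handles many more fields (finish reasons, logprobs, response metadata).

```python
# Hypothetical sketch: converting OpenAI-style stream chunks into
# simplified message chunks (dict stand-ins, not the real classes).
from typing import Any, Iterator


def convert_chunks(openai_chunks: list[dict[str, Any]]) -> Iterator[dict[str, Any]]:
    """Yield simplified message chunks from OpenAI-style stream chunks."""
    for chunk in openai_chunks:
        if not chunk.get("choices"):
            # A chunk without choices may still carry usage
            # (sent last when stream_usage=True).
            if chunk.get("usage"):
                yield {"content": "", "usage_metadata": chunk["usage"]}
            continue
        delta = chunk["choices"][0].get("delta", {})
        out: dict[str, Any] = {"content": delta.get("content") or ""}
        if delta.get("tool_calls"):
            out["tool_call_chunks"] = delta["tool_calls"]
        yield out


# Example: two content deltas followed by a usage-only chunk.
stream = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
    {"choices": [], "usage": {"total_tokens": 12}},
]
chunks = list(convert_chunks(stream))
```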

Usage

This is an internal method called by stream() and astream(). Users do not call it directly.

Code Reference

Source Location

  • Repository: langchain
  • File: libs/partners/openai/langchain_openai/chat_models/base.py
  • Lines: L1304-1377

Signature

def _stream(
    self,
    messages: list[BaseMessage],
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    *,
    stream_usage: bool | None = None,
    **kwargs: Any,
) -> Iterator[ChatGenerationChunk]:

Import

# Internal method — accessed via ChatOpenAI instance
from langchain_openai import ChatOpenAI

I/O Contract

Inputs

  • messages (list[BaseMessage], required): messages to send to the model
  • stop (list[str] | None, optional): stop sequences
  • stream_usage (bool | None, optional): whether to include token usage in streamed chunks

Outputs

  • return (Iterator[ChatGenerationChunk]): chunks whose AIMessageChunk carries incremental content and tool_call_chunks
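LangChain's chunk classes support addition, so a caller can rebuild the full message by summing the stream. Sketched here with a plain stand-in class; the real AIMessageChunk also merges tool_call_chunks, usage_metadata, and response metadata when added.

```python
# Stand-in for AIMessageChunk illustrating the additive-chunk pattern.
from dataclasses import dataclass


@dataclass
class MessageChunk:
    content: str

    def __add__(self, other: "MessageChunk") -> "MessageChunk":
        # Real chunks merge more fields than just content.
        return MessageChunk(content=self.content + other.content)


chunks = [MessageChunk("Str"), MessageChunk("eam"), MessageChunk("ing")]
full = chunks[0]
for c in chunks[1:]:
    full = full + c
```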

Usage Examples

Observing Stream Chunks

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", stream_usage=True)

for chunk in llm.stream("Hello!"):
    # Each chunk has incremental content
    if chunk.content:
        print(chunk.content, end="")
    # Last chunk may have usage_metadata
    if chunk.usage_metadata:
        print(f"\nTokens: {chunk.usage_metadata}")
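When the model streams tool calls, the arguments arrive as JSON fragments spread across many chunks, each tagged with an index identifying which tool call it belongs to. A hedged sketch of accumulating such fragments before parsing, using dict stand-ins for tool_call_chunks (field names mirror the common shape but are assumptions here):

```python
# Hypothetical accumulator: merge streamed tool-call fragments by index,
# then parse the concatenated JSON argument string.
import json
from typing import Any


def accumulate_tool_calls(tool_call_chunks: list[dict[str, Any]]) -> list[dict[str, Any]]:
    calls: dict[int, dict[str, Any]] = {}
    for tc in tool_call_chunks:
        slot = calls.setdefault(tc["index"], {"id": None, "name": "", "args": ""})
        if tc.get("id"):
            slot["id"] = tc["id"]  # the id typically arrives on the first fragment only
        slot["name"] += tc.get("name") or ""
        slot["args"] += tc.get("args") or ""
    return [
        {"id": c["id"], "name": c["name"], "args": json.loads(c["args"])}
        for c in calls.values()
    ]


fragments = [
    {"index": 0, "id": "call_1", "name": "get_weather", "args": ""},
    {"index": 0, "id": None, "name": "", "args": '{"city": '},
    {"index": 0, "id": None, "name": "", "args": '"Paris"}'},
]
calls = accumulate_tool_calls(fragments)
```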
