Implementation: Mistral AI Python Client EventStream Context Manager
| Knowledge Sources | Details |
|---|---|
| Domains | Streaming, Resource_Management |
| Last Updated | 2026-02-15 14:00 GMT |
Overview
Concrete mechanism for managing the streaming connection lifecycle via the context manager protocol implemented by the EventStream classes.
Description
The EventStream.__exit__() and EventStreamAsync.__aexit__() methods handle connection cleanup when the stream context manager exits. __exit__ calls self.response.close() to synchronously close the httpx response and release the underlying socket. __aexit__ calls await self.response.aclose() for async cleanup. These methods are invoked automatically by the with / async with statement, ensuring resource cleanup even on exception.
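The cleanup pattern described above can be sketched with stand-in classes. This is a minimal illustration, not the SDK source: `FakeResponse` is a hypothetical substitute for the httpx response object, and only the `__exit__` behavior mirrors what the description states.

```python
from typing import Generic, TypeVar

T = TypeVar("T")


class FakeResponse:
    """Hypothetical stand-in for an httpx response (not part of the SDK)."""

    def __init__(self) -> None:
        self.closed = False

    def close(self) -> None:
        self.closed = True


class EventStream(Generic[T]):
    """Minimal sketch of the sync stream wrapper's exit behavior."""

    def __init__(self, response: FakeResponse) -> None:
        self.response = response

    def __enter__(self) -> "EventStream[T]":
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        # Synchronously close the response, releasing the socket
        self.response.close()


resp = FakeResponse()
with EventStream(resp):
    pass  # consume events here
print(resp.closed)  # → True
```

The async variant is structurally identical, with `__aenter__`/`__aexit__` awaiting `aclose()` instead.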
Usage
This is automatically invoked when exiting a with / async with block that wraps a streaming response. No manual invocation is needed; just ensure streaming responses are always wrapped in the context manager pattern.
Code Reference
Source Location
- Repository: client-python
- File: src/mistralai/client/utils/eventstreaming.py
- Lines: L43-47 (sync __exit__), L74-78 (async __aexit__)
Signature
```python
class EventStream(Generic[T]):
    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        self.response.close()

class EventStreamAsync(Generic[T]):
    async def __aexit__(self, exc_type, exc_val, exc_tb) -> None:
        await self.response.aclose()
```
Import
```python
# Used implicitly via context managers:
# with client.chat.stream(...) as stream:
#     ...  # __exit__ called automatically
```
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| EventStream instance | EventStream / EventStreamAsync | Yes | The active stream to finalize |
Outputs
| Name | Type | Description |
|---|---|---|
| (side effect) | None | HTTP connection closed, resources released |
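Because `__exit__` is invoked by the interpreter during stack unwinding, the side effect (closing the connection) occurs even when the body of the `with` block raises. A self-contained demonstration with a hypothetical `RecordingStream`:

```python
class RecordingStream:
    """Hypothetical stream that records whether cleanup ran."""

    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.closed = True   # cleanup side effect
        return False         # returning False propagates the exception


s = RecordingStream()
try:
    with s:
        raise RuntimeError("network error mid-stream")
except RuntimeError:
    pass
print(s.closed)  # → True
```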
Usage Examples
Automatic Cleanup with Context Manager
```python
from mistralai import Mistral
from mistralai.models import UserMessage

client = Mistral(api_key="...")

# The 'with' statement ensures __exit__ is called
with client.chat.stream(
    model="mistral-large-latest",
    messages=[UserMessage(content="Hello")],
) as stream:
    for chunk in stream:
        print(chunk.data.choices[0].delta.content, end="")
# Connection automatically closed here
```

Async version (top-level `async with` must run inside a coroutine):

```python
import asyncio

from mistralai import Mistral
from mistralai.models import UserMessage


async def main() -> None:
    client = Mistral(api_key="...")
    async with client.chat.stream_async(
        model="mistral-large-latest",
        messages=[UserMessage(content="Hello")],
    ) as stream:
        async for chunk in stream:
            print(chunk.data.choices[0].delta.content, end="")
    # Connection automatically closed here


asyncio.run(main())
```
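The async path behaves the same way: `__aexit__` awaits the response's `aclose()`. A runnable sketch with hypothetical stand-ins (`FakeAsyncResponse` is not an SDK class):

```python
import asyncio


class FakeAsyncResponse:
    """Hypothetical stand-in for an httpx async response."""

    def __init__(self):
        self.closed = False

    async def aclose(self):
        self.closed = True


class EventStreamAsync:
    """Minimal sketch of the async stream wrapper's exit behavior."""

    def __init__(self, response):
        self.response = response

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        # Asynchronously close the response, releasing the connection
        await self.response.aclose()


async def main():
    resp = FakeAsyncResponse()
    async with EventStreamAsync(resp):
        pass  # consume events here
    return resp.closed


print(asyncio.run(main()))  # → True
```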