# API Reference: Streaming

## udspy.streaming
Streaming support with an event queue for incremental LLM outputs and tool updates.
## Classes
### OutputStreamChunk

Bases: `StreamChunk`
A chunk of streamed LLM output for a specific field.
Attributes:

| Name | Type | Description |
|---|---|---|
| `field_name` | `str` | Name of the output field |
| `delta` | `str` | Incremental content for this field (new text since last chunk) |
| `content` | `str` | Full accumulated content for this field so far |
| `is_complete` | `bool` | Whether this field is finished streaming |
Source code in src/udspy/streaming.py
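The relationship between `delta` and `content` can be sketched with a minimal stand-in (the `Chunk` dataclass and the `"answer"` field below are illustrative, not the real class): each delta carries only the new text, so concatenating deltas reproduces the accumulated content.

```python
from dataclasses import dataclass


# Illustrative stand-in mirroring the documented OutputStreamChunk fields;
# the real class lives in udspy.streaming.
@dataclass
class Chunk:
    field_name: str
    delta: str
    content: str
    is_complete: bool


# A hypothetical stream of chunks for one output field:
chunks = [
    Chunk("answer", "Hel", "Hel", False),
    Chunk("answer", "lo", "Hello", False),
    Chunk("answer", "", "Hello", True),
]

text = ""
for chunk in chunks:
    text += chunk.delta  # deltas carry only the new text...

assert text == chunks[-1].content  # ...so they concatenate to the full content
```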
### Prediction

Bases: `StreamEvent`, `dict[str, Any]`
Final prediction result with attribute access.
This is both a StreamEvent (so it can be yielded from `astream`) and a dict, with attribute-style access to its outputs for convenience.
Attributes:

| Name | Type | Description |
|---|---|---|
| `module` | | The module that produced this prediction |
| `native_tool_calls` | | Tool calls from the native LLM response (if any) |
Example
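A minimal sketch of the dict-plus-attribute-access pattern described above. `AttrDict` merely mimics the documented behavior, and the `answer`/`reasoning` output fields are hypothetical:

```python
# Stand-in mimicking Prediction's dual dict/attribute access; not the real class.
class AttrDict(dict):
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError as exc:
            raise AttributeError(name) from exc


pred = AttrDict(answer="Paris", reasoning="Capital of France")
print(pred.answer)     # attribute access -> Paris
print(pred["answer"])  # dict access -> Paris
```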
Attributes:

| Name | Type | Description |
|---|---|---|
| `is_final` | property | Whether this is the final prediction (no pending tool calls) |
### StreamChunk

Bases: `StreamEvent`
A chunk of streamed output from a Module.
### StreamEvent
Base class for all stream events.
Users can define custom event types by inheriting from this class. The only built-in events are OutputStreamChunk and Prediction.
Example
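A sketch of defining a custom event type by subclassing. A local `StreamEvent` stand-in is used so the snippet is self-contained; in real code you would import it from `udspy.streaming`, and `ProgressUpdate` is a hypothetical event:

```python
from dataclasses import dataclass


class StreamEvent:  # stand-in for udspy.streaming.StreamEvent
    """Base class for all stream events."""


@dataclass
class ProgressUpdate(StreamEvent):  # hypothetical custom event type
    percent: int
    message: str


event = ProgressUpdate(50, "halfway there")
assert isinstance(event, StreamEvent)  # custom events are ordinary StreamEvents
```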
### ThoughtStreamChunk

Bases: `StreamChunk`
A chunk of streamed reasoning output for a specific step.
Attributes:

| Name | Type | Description |
|---|---|---|
| `module` | `Module` | The module emitting this chunk |
| `delta` | `str` | Incremental content for this step (new text since last chunk) |
| `content` | `str` | Full accumulated content for this step so far |
| `is_complete` | `bool` | Whether this step is finished streaming |
## Functions

### emit_event(event)
Emit an event to the active stream.
This can be called from anywhere (tools, callbacks, etc.) to inject events into the current streaming context. If no stream is active, this is a no-op (silently ignored).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `event` | `StreamEvent` | The event to emit (any subclass of StreamEvent) | *required* |
Example:

```python
from dataclasses import dataclass

from udspy.streaming import OutputStreamChunk, StreamEvent, emit_event


@dataclass
class ToolStatus(StreamEvent):
    message: str


async def my_tool():
    emit_event(ToolStatus("Starting search..."))
    result = await do_search()
    emit_event(ToolStatus("Search complete"))
    return result


# In the stream consumer:
async for event in predictor.astream(question="..."):
    if isinstance(event, ToolStatus):
        print(f"📊 {event.message}")
    elif isinstance(event, OutputStreamChunk):
        print(event.delta, end="", flush=True)
```