test_stream() — langchain Function Reference
Architecture documentation for the test_stream() function in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    ca3288cd_8374_af4e_a626_6b07b0be4a5b["test_stream()"]
    bd382a4e_442c_13ae_530c_6e34bc43623d["test_base.py"]
    ca3288cd_8374_af4e_a626_6b07b0be4a5b -->|defined in| bd382a4e_442c_13ae_530c_6e34bc43623d
    style ca3288cd_8374_af4e_a626_6b07b0be4a5b fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 267–321
def test_stream() -> None:
    """Test streaming tokens from OpenAI."""
    llm = ChatOpenAI(
        model="gpt-5-nano",
        service_tier="flex",  # Also test service_tier
        max_retries=3,  # Add retries for 503 capacity errors
    )
    full: BaseMessageChunk | None = None
    for chunk in llm.stream("I'm Pickle Rick"):
        assert isinstance(chunk.content, str)
        full = chunk if full is None else full + chunk
    assert isinstance(full, AIMessageChunk)
    assert full.response_metadata.get("finish_reason") is not None
    assert full.response_metadata.get("model_name") is not None

    # check token usage
    aggregate: BaseMessageChunk | None = None
    chunks_with_token_counts = 0
    chunks_with_response_metadata = 0
    for chunk in llm.stream("Hello"):
        assert isinstance(chunk.content, str)
        aggregate = chunk if aggregate is None else aggregate + chunk
        assert isinstance(chunk, AIMessageChunk)
        if chunk.usage_metadata is not None:
            chunks_with_token_counts += 1
        if chunk.response_metadata and not set(chunk.response_metadata.keys()).issubset(
            {"model_provider", "output_version"}
        ):
            chunks_with_response_metadata += 1
    if chunks_with_token_counts != 1 or chunks_with_response_metadata != 1:
        msg = (
            "Expected exactly one chunk with metadata. "
            "AIMessageChunk aggregation can add these metadata. Check that "
            "this is behaving properly."
        )
        raise AssertionError(msg)
    assert isinstance(aggregate, AIMessageChunk)
    assert aggregate.usage_metadata is not None
    assert aggregate.usage_metadata["input_tokens"] > 0
    assert aggregate.usage_metadata["output_tokens"] > 0
    assert aggregate.usage_metadata["total_tokens"] > 0
    assert aggregate.usage_metadata.get("input_token_details", {}).get("flex", 0) > 0  # type: ignore[operator]
    assert aggregate.usage_metadata.get("output_token_details", {}).get("flex", 0) > 0  # type: ignore[operator]
    assert (
        aggregate.usage_metadata.get("output_token_details", {}).get(  # type: ignore[operator]
            "flex_reasoning", 0
        )
        > 0
    )
    assert aggregate.usage_metadata.get("output_token_details", {}).get(  # type: ignore[operator]
        "flex_reasoning", 0
    ) + aggregate.usage_metadata.get("output_token_details", {}).get(
        "flex", 0
    ) == aggregate.usage_metadata.get("output_tokens")
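This is a live integration test that calls the OpenAI API. Assuming a checkout of the langchain repository, an exported OPENAI_API_KEY, and an account with access to the flex service tier, it can typically be selected with pytest's node-id syntax, e.g. pytest libs/partners/openai/tests/integration_tests/chat_models/test_base.py::test_stream.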
Frequently Asked Questions
What does test_stream() do?
test_stream() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It streams two prompts through a ChatOpenAI instance (model "gpt-5-nano" with service_tier="flex"), asserts that each chunk's content is a string, merges the chunks into a single AIMessageChunk, and verifies that the aggregated message carries finish_reason and model_name in its response metadata, that exactly one chunk reports usage metadata, and that the aggregated input, output, and total token counts are positive.
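The core pattern the test verifies, streaming chunks and merging them with the + operator, can be reproduced outside the test suite. A minimal sketch, assuming langchain-openai is installed and OPENAI_API_KEY is set; the model name and prompt are illustrative, not taken from the test:

from langchain_core.messages import AIMessageChunk, BaseMessageChunk
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model name

full: BaseMessageChunk | None = None
for chunk in llm.stream("Hello"):
    # Each streamed chunk is an AIMessageChunk; "+" merges content and metadata.
    full = chunk if full is None else full + chunk

assert isinstance(full, AIMessageChunk)
print(full.content)
print(full.response_metadata.get("finish_reason"))
print(full.usage_metadata)  # may be None depending on version and settings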
Where is test_stream() defined?
test_stream() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 267.