test_output_version_astream() — langchain Function Reference
Architecture documentation for the test_output_version_astream() function in test_base.py from the langchain codebase.
Dependency Diagram
test_output_version_astream() --defined in--> test_base.py
Source Code
libs/core/tests/unit_tests/language_models/chat_models/test_base.py lines 1063–1159
async def test_output_version_astream(monkeypatch: Any) -> None:
    messages = [AIMessage("foo bar")]

    # v0
    llm = GenericFakeChatModel(messages=iter(messages))
    full = None
    async for chunk in llm.astream("hello"):
        assert isinstance(chunk, AIMessageChunk)
        assert isinstance(chunk.content, str)
        assert chunk.content
        full = chunk if full is None else full + chunk
    assert isinstance(full, AIMessageChunk)
    assert full.content == "foo bar"

    # v1
    llm = GenericFakeChatModel(messages=iter(messages), output_version="v1")
    full_v1: AIMessageChunk | None = None
    async for chunk in llm.astream("hello"):
        assert isinstance(chunk, AIMessageChunk)
        assert isinstance(chunk.content, list)
        assert len(chunk.content) == 1
        block = chunk.content[0]
        assert isinstance(block, dict)
        assert block["type"] == "text"
        assert block["text"]
        full_v1 = chunk if full_v1 is None else full_v1 + chunk
    assert isinstance(full_v1, AIMessageChunk)
    assert full_v1.response_metadata["output_version"] == "v1"
    assert full_v1.content == [{"type": "text", "text": "foo bar", "index": 0}]

    # Test text blocks
    llm_with_rich_content = _AnotherFakeChatModel(
        responses=iter([]),
        chunks=iter(
            [
                AIMessageChunk(content="foo "),
                AIMessageChunk(content="bar"),
            ]
        ),
        output_version="v1",
    )
    full_v1 = None
    async for chunk in llm_with_rich_content.astream("hello"):
        full_v1 = chunk if full_v1 is None else full_v1 + chunk
    assert isinstance(full_v1, AIMessageChunk)
    assert full_v1.content_blocks == [{"type": "text", "text": "foo bar", "index": 0}]

    # Test content blocks of different types
    chunks = [
        AIMessageChunk(content="", additional_kwargs={"reasoning_content": "<rea"}),
        AIMessageChunk(content="", additional_kwargs={"reasoning_content": "soning>"}),
        AIMessageChunk(content="<some "),
        AIMessageChunk(content="text>"),
    ]
    llm_with_rich_content = _AnotherFakeChatModel(
        responses=iter([]),
        chunks=iter(chunks),
        output_version="v1",
    )
    full_v1 = None
    async for chunk in llm_with_rich_content.astream("hello"):
        full_v1 = chunk if full_v1 is None else full_v1 + chunk
    assert isinstance(full_v1, AIMessageChunk)
    assert full_v1.content_blocks == [
        {"type": "reasoning", "reasoning": "<reasoning>", "index": 0},
        {"type": "text", "text": "<some text>", "index": 1},
    ]

    # Test invoke with stream=True
    llm_with_rich_content = _AnotherFakeChatModel(
        responses=iter([]),
        chunks=iter(chunks),
        output_version="v1",
    )
    response_v1 = await llm_with_rich_content.ainvoke("hello", stream=True)
    assert response_v1.content_blocks == [
        {"type": "reasoning", "reasoning": "<reasoning>", "index": 0},
        {"type": "text", "text": "<some text>", "index": 1},
    ]
Frequently Asked Questions
What does test_output_version_astream() do?
test_output_version_astream() is an async unit test in the langchain codebase, defined in libs/core/tests/unit_tests/language_models/chat_models/test_base.py. It verifies that chat models stream plain string content by default ("v0") and typed content blocks when constructed with output_version="v1", covering astream() as well as ainvoke(..., stream=True).
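As a small illustration of the two shapes the test asserts (values taken directly from the test body): "v0" exposes message content as a plain string, while "v1" exposes a list of typed, indexed content blocks.

```python
# Content shapes asserted by test_output_version_astream(), shown as plain
# Python values so the difference is visible without running langchain.
v0_content = "foo bar"
v1_content = [{"type": "text", "text": "foo bar", "index": 0}]

# A v1 block is a dict whose "type" field tells the consumer how to read it.
assert isinstance(v0_content, str)
assert all(isinstance(block, dict) and "type" in block for block in v1_content)
```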
Where is test_output_version_astream() defined?
test_output_version_astream() is defined in libs/core/tests/unit_tests/language_models/chat_models/test_base.py at line 1063.