test_async_response_metadata_streaming() — langchain Function Reference
Architecture documentation for the test_async_response_metadata_streaming() function in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    fdbe84a9_d03e_ee00_5b56_b145eb181ab0["test_async_response_metadata_streaming()"]
    bd382a4e_442c_13ae_530c_6e34bc43623d["test_base.py"]
    fdbe84a9_d03e_ee00_5b56_b145eb181ab0 -->|defined in| bd382a4e_442c_13ae_530c_6e34bc43623d
    style fdbe84a9_d03e_ee00_5b56_b145eb181ab0 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 476–486
async def test_async_response_metadata_streaming() -> None:
    llm = ChatOpenAI()
    full: BaseMessageChunk | None = None
    async for chunk in llm.astream("I'm Pickle Rick", logprobs=True):
        assert isinstance(chunk.content, str)
        full = chunk if full is None else full + chunk
    assert all(
        k in cast(BaseMessageChunk, full).response_metadata
        for k in ("logprobs", "finish_reason", "service_tier")
    )
    assert "content" in cast(BaseMessageChunk, full).response_metadata["logprobs"]
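The test depends on the fact that streamed chunks can be merged with `+`, so metadata attached to any chunk survives into the final accumulated message. A minimal sketch of that accumulation pattern, using a hypothetical `Chunk` stand-in rather than langchain's actual `BaseMessageChunk` (the merge semantics shown are an assumption based on the test's behavior):

```python
# Hypothetical stand-in for BaseMessageChunk: "+" concatenates content and
# merges response_metadata, mirroring how the test accumulates streamed chunks.
class Chunk:
    def __init__(self, content, response_metadata=None):
        self.content = content
        self.response_metadata = response_metadata or {}

    def __add__(self, other):
        # Later chunks' metadata keys win on conflict.
        merged = {**self.response_metadata, **other.response_metadata}
        return Chunk(self.content + other.content, merged)


# Simulated stream: the final chunk carries the response metadata, as is
# typical for streamed chat completions.
stream = [
    Chunk("I'm "),
    Chunk("Pickle "),
    Chunk("Rick", {"finish_reason": "stop", "logprobs": {"content": []}}),
]

# The same accumulation loop the test uses.
full = None
for chunk in stream:
    full = chunk if full is None else full + chunk

print(full.content)                    # I'm Pickle Rick
print(sorted(full.response_metadata))  # ['finish_reason', 'logprobs']
```

Because every chunk is folded into `full`, the assertions at the end of the test can inspect metadata regardless of which chunk in the stream carried it.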
Frequently Asked Questions
What does test_async_response_metadata_streaming() do?
test_async_response_metadata_streaming() is an async integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It streams a response from ChatOpenAI with logprobs=True, accumulates the chunks into a single message, and asserts that the final message's response_metadata contains the "logprobs", "finish_reason", and "service_tier" keys, and that the logprobs payload includes a "content" entry.
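The checks the test performs can be run against a plain dict. Below, the key names come from the test itself, but the sample values are assumptions about what an OpenAI streamed response might carry:

```python
# Hypothetical response_metadata payload; field names are those the test
# asserts on, values are illustrative only.
response_metadata = {
    "finish_reason": "stop",
    "service_tier": "default",
    "logprobs": {"content": [{"token": "Rick", "logprob": -0.01}]},
}

# The same membership checks the test applies to the accumulated message.
assert all(
    k in response_metadata for k in ("logprobs", "finish_reason", "service_tier")
)
assert "content" in response_metadata["logprobs"]
print("metadata checks pass")
```

Note that the test only verifies the presence of these keys, not their values, so it stays robust to changes in the model's actual token probabilities.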
Where is test_async_response_metadata_streaming() defined?
test_async_response_metadata_streaming() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 476.