test_response_metadata_streaming() — langchain Function Reference
Architecture documentation for the test_response_metadata_streaming() function in test_base.py from the langchain codebase.
Dependency Diagram
graph TD
    f9f5bda0_7aa0_4cd0_cbdf_6d323250fd89["test_response_metadata_streaming()"]
    bd382a4e_442c_13ae_530c_6e34bc43623d["test_base.py"]
    f9f5bda0_7aa0_4cd0_cbdf_6d323250fd89 -->|defined in| bd382a4e_442c_13ae_530c_6e34bc43623d
    style f9f5bda0_7aa0_4cd0_cbdf_6d323250fd89 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 463–473
def test_response_metadata_streaming() -> None:
    llm = ChatOpenAI()
    full: BaseMessageChunk | None = None
    for chunk in llm.stream("I'm Pickle Rick", logprobs=True):
        assert isinstance(chunk.content, str)
        full = chunk if full is None else full + chunk
    assert all(
        k in cast(BaseMessageChunk, full).response_metadata
        for k in ("logprobs", "finish_reason", "service_tier")
    )
    assert "content" in cast(BaseMessageChunk, full).response_metadata["logprobs"]
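The accumulation pattern above (`full = chunk if full is None else full + chunk`) works because adding message chunks merges both their text content and their `response_metadata`; metadata keys such as `finish_reason` typically arrive only on the final chunk. A minimal sketch of that merging semantics, using a hypothetical `Chunk` class rather than the real langchain `BaseMessageChunk` (the real class handles more fields and merge rules):

```python
from dataclasses import dataclass, field


@dataclass
class Chunk:
    """Toy stand-in for BaseMessageChunk -- NOT the real langchain class."""
    content: str = ""
    response_metadata: dict = field(default_factory=dict)

    def __add__(self, other: "Chunk") -> "Chunk":
        # Concatenate streamed text and merge metadata; later chunks
        # (which carry finish_reason, service_tier, etc.) win on conflicts.
        return Chunk(
            content=self.content + other.content,
            response_metadata={**self.response_metadata, **other.response_metadata},
        )


# Simulated stream: text chunks first, metadata-bearing chunks at the end.
chunks = [
    Chunk(content="I'm "),
    Chunk(content="Pickle Rick", response_metadata={"logprobs": {"content": []}}),
    Chunk(response_metadata={"finish_reason": "stop", "service_tier": "default"}),
]

full = None
for chunk in chunks:
    full = chunk if full is None else full + chunk

assert full.content == "I'm Pickle Rick"
assert all(
    k in full.response_metadata
    for k in ("logprobs", "finish_reason", "service_tier")
)
```

The test's assertions mirror this: only after the whole stream is folded together can all three metadata keys be expected to be present.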
Frequently Asked Questions
What does test_response_metadata_streaming() do?
test_response_metadata_streaming() is an integration test in the langchain OpenAI partner package, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It streams a ChatOpenAI completion with logprobs=True, accumulates the chunks into a single message, and asserts that the aggregated response_metadata contains the "logprobs", "finish_reason", and "service_tier" keys, with a "content" entry inside the logprobs payload.
Where is test_response_metadata_streaming() defined?
test_response_metadata_streaming() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 463.