test_astream_response_format() — langchain Function Reference
Architecture documentation for the test_astream_response_format() function in test_azure.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    c44823cf_b22d_b3da_c437_eaa69bd25db3["test_astream_response_format()"]
    c413d48d_e43d_eae6_47cb_3eea9394c77c["test_azure.py"]
    c44823cf_b22d_b3da_c437_eaa69bd25db3 -->|defined in| c413d48d_e43d_eae6_47cb_3eea9394c77c
    style c44823cf_b22d_b3da_c437_eaa69bd25db3 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_azure.py lines 289–301
async def test_astream_response_format(llm: AzureChatOpenAI) -> None:
    full: BaseMessageChunk | None = None
    chunks = []
    async for chunk in llm.astream("how are ya", response_format=Foo):
        chunks.append(chunk)
        full = chunk if full is None else full + chunk
    assert len(chunks) > 1
    assert isinstance(full, AIMessageChunk)
    parsed = full.additional_kwargs["parsed"]
    assert isinstance(parsed, Foo)
    assert isinstance(full.content, str)
    parsed_content = json.loads(full.content)
    assert parsed.response == parsed_content["response"]
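The excerpt depends on a Foo schema and an llm fixture defined elsewhere in the test module. The sketch below is assumed context, not the actual definitions from test_azure.py: the response field is inferred from the assertions above, while the Pydantic base class, the environment variable name, and the fixture parameters are illustrative guesses.

# Hypothetical surrounding context for the excerpt above (not the real
# definitions from test_azure.py). Foo is assumed to be a Pydantic model
# because response_format accepts Pydantic classes for structured output.
import os

import pytest
from pydantic import BaseModel

from langchain_openai import AzureChatOpenAI


class Foo(BaseModel):
    response: str  # field name inferred from parsed.response in the test


@pytest.fixture
def llm() -> AzureChatOpenAI:
    # Assumed fixture: Azure credentials and endpoint are expected to come
    # from the standard AZURE_OPENAI_* environment variables; the deployment
    # variable name here is a placeholder.
    return AzureChatOpenAI(
        azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"],
        temperature=0,
    )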
Frequently Asked Questions
What does test_astream_response_format() do?
test_astream_response_format() is an async integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py. It streams a response from an AzureChatOpenAI model with response_format=Foo, aggregates the chunks into a single AIMessageChunk, and asserts that the aggregated message carries a parsed Foo instance under additional_kwargs["parsed"] whose response field matches the JSON serialized in the message content.
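Outside of the test suite, the same aggregation pattern can be used in application code. The following is a minimal, hedged sketch assuming a reachable Azure OpenAI deployment configured through environment variables and a Pydantic Foo model with a single response field; only astream(), the + operator on message chunks, and additional_kwargs["parsed"] are taken directly from the test above.

import asyncio
import os

from langchain_core.messages import BaseMessageChunk
from langchain_openai import AzureChatOpenAI
from pydantic import BaseModel


class Foo(BaseModel):
    response: str


async def stream_structured(prompt: str) -> Foo:
    # Placeholder deployment name; credentials, endpoint, and API version are
    # assumed to be provided via the standard AZURE_OPENAI_* / OPENAI_* env vars.
    llm = AzureChatOpenAI(azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"])
    full: BaseMessageChunk | None = None
    async for chunk in llm.astream(prompt, response_format=Foo):
        # Message chunks support +, so the running sum accumulates content
        # and additional_kwargs (including the parsed structured output).
        full = chunk if full is None else full + chunk
    if full is None:
        raise RuntimeError("model produced no chunks")
    return full.additional_kwargs["parsed"]


print(asyncio.run(stream_structured("how are ya")).response)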
Where is test_astream_response_format() defined?
test_astream_response_format() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py at line 289.