test_stream_response_format() — langchain Function Reference
Architecture documentation for the test_stream_response_format() function in test_azure.py from the langchain codebase.
Dependency Diagram
graph TD
    2146444e_1643_15ef_9ae5_f35122e634dc["test_stream_response_format()"]
    c413d48d_e43d_eae6_47cb_3eea9394c77c["test_azure.py"]
    2146444e_1643_15ef_9ae5_f35122e634dc -->|defined in| c413d48d_e43d_eae6_47cb_3eea9394c77c
    style 2146444e_1643_15ef_9ae5_f35122e634dc fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_azure.py lines 274–286
def test_stream_response_format(llm: AzureChatOpenAI) -> None:
    full: BaseMessageChunk | None = None
    chunks = []
    for chunk in llm.stream("how are ya", response_format=Foo):
        chunks.append(chunk)
        full = chunk if full is None else full + chunk
    assert len(chunks) > 1
    assert isinstance(full, AIMessageChunk)
    parsed = full.additional_kwargs["parsed"]
    assert isinstance(parsed, Foo)
    assert isinstance(full.content, str)
    parsed_content = json.loads(full.content)
    assert parsed.response == parsed_content["response"]
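The line `full = chunk if full is None else full + chunk` relies on message chunks supporting `+`: in langchain-core, chunk types such as AIMessageChunk define concatenation so that streamed pieces can be folded into one complete message. A minimal, self-contained sketch of that accumulation pattern, using a toy dataclass rather than the real langchain-core API:

```python
from dataclasses import dataclass


# Toy stand-in for a message chunk; illustrates the accumulation
# pattern only, not the real AIMessageChunk class.
@dataclass
class Chunk:
    content: str

    def __add__(self, other: "Chunk") -> "Chunk":
        # Concatenating two chunks merges their content, mirroring how
        # langchain-core chunks combine when streamed pieces are summed.
        return Chunk(content=self.content + other.content)


full = None
for chunk in [Chunk("Hel"), Chunk("lo "), Chunk("world")]:
    # Same fold as in the test: first chunk seeds the accumulator,
    # later chunks are merged into it.
    full = chunk if full is None else full + chunk

print(full.content)  # -> "Hello world"
```

The same fold could be written with `functools.reduce`, but the explicit loop matches the test's shape, which also needs the individual chunks for the `len(chunks) > 1` assertion.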
Frequently Asked Questions
What does test_stream_response_format() do?
test_stream_response_format() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py. It streams a prompt through an AzureChatOpenAI model with response_format=Foo, asserts that more than one chunk is produced, aggregates the chunks into a single AIMessageChunk, and then verifies that the structured output in additional_kwargs["parsed"] is a Foo instance whose response field matches the JSON in the message content.
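The test's final assertions cross-check the parsed structured output against the raw streamed JSON content. A minimal sketch of that consistency check, using a hypothetical dataclass as a stand-in for Foo (the real Foo is a schema defined elsewhere in the test module, and the content string here is invented for illustration):

```python
import json
from dataclasses import dataclass


# Hypothetical stand-in for the Foo schema; the real Foo is defined
# elsewhere in the test module.
@dataclass
class Foo:
    response: str


# In the real test, `content` is the accumulated streamed text and
# `parsed` comes from full.additional_kwargs["parsed"].
content = '{"response": "doing well, thanks"}'
parsed = Foo(response="doing well, thanks")

# The raw content must itself be valid JSON...
parsed_content = json.loads(content)
# ...and agree with the parsed object on the "response" field.
assert parsed.response == parsed_content["response"]
```

Checking both representations guards against the model emitting structured output that diverges from the text it streamed.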
Where is test_stream_response_format() defined?
test_stream_response_format() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py at line 274.