test_chat_openai_streaming() — langchain Function Reference
Architecture documentation for the test_chat_openai_streaming() function in test_azure.py from the langchain codebase.
Dependency Diagram
graph TD
    cfd81479_eabb_46d3_fba5_81dbf99e9bc9["test_chat_openai_streaming()"]
    c413d48d_e43d_eae6_47cb_3eea9394c77c["test_azure.py"]
    cfd81479_eabb_46d3_fba5_81dbf99e9bc9 -->|defined in| c413d48d_e43d_eae6_47cb_3eea9394c77c
    ccf90d51_5ff2_7b50_5922_b28b80cca584["_get_llm()"]
    cfd81479_eabb_46d3_fba5_81dbf99e9bc9 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    style cfd81479_eabb_46d3_fba5_81dbf99e9bc9 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_azure.py lines 86–100
def test_chat_openai_streaming() -> None:
    """Test that streaming correctly invokes on_llm_new_token callback."""
    callback_handler = FakeCallbackHandler()
    callback_manager = CallbackManager([callback_handler])
    chat = _get_llm(
        max_tokens=10,
        streaming=True,
        temperature=0,
        callbacks=callback_manager,
        verbose=True,
    )
    message = HumanMessage(content="Hello")
    response = chat.invoke([message])
    assert callback_handler.llm_streams > 0
    assert isinstance(response, BaseMessage)
Frequently Asked Questions
What does test_chat_openai_streaming() do?
test_chat_openai_streaming() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py. It verifies that invoking a streaming-enabled Azure chat model fires the on_llm_new_token callback at least once and returns a BaseMessage.
Where is test_chat_openai_streaming() defined?
test_chat_openai_streaming() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py at line 86.
What does test_chat_openai_streaming() call?
test_chat_openai_streaming() calls one function: _get_llm(), the module's test helper for constructing the chat model.