test_chat_openai_streaming() — langchain Function Reference

Architecture documentation for the test_chat_openai_streaming() function in test_base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  test_chat_openai_streaming["test_chat_openai_streaming()"]
  test_base_py["test_base.py"]
  test_chat_openai_streaming -->|defined in| test_base_py
  style test_chat_openai_streaming fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 158–173

def test_chat_openai_streaming(use_responses_api: bool) -> None:
    """Test that streaming correctly invokes on_llm_new_token callback."""
    callback_handler = FakeCallbackHandler()
    callback_manager = CallbackManager([callback_handler])
    chat = ChatOpenAI(
        max_tokens=MAX_TOKEN_COUNT,  # type: ignore[call-arg]
        streaming=True,
        temperature=0,
        callbacks=callback_manager,
        verbose=True,
        use_responses_api=use_responses_api,
    )
    message = HumanMessage(content="Hello")
    response = chat.invoke([message])
    assert callback_handler.llm_streams > 0
    assert isinstance(response, BaseMessage)
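The excerpt starts at line 158, so it omits the module-level context it depends on. The sketch below shows what that context plausibly looks like; the FakeCallbackHandler import path, the MAX_TOKEN_COUNT value, and the parametrize values are assumptions rather than part of the excerpt.

# Hypothetical surrounding context for the excerpt above; import paths,
# constant value, and parametrize values are assumptions.
import pytest
from langchain_core.callbacks import CallbackManager
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_openai import ChatOpenAI

from tests.unit_tests.fake.callbacks import FakeCallbackHandler  # path assumed

MAX_TOKEN_COUNT = 16  # assumed small cap to keep the integration test fast

# The use_responses_api argument implies pytest parametrization, for example:
@pytest.mark.parametrize("use_responses_api", [False, True])
def test_chat_openai_streaming(use_responses_api: bool) -> None: ...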

Frequently Asked Questions

What does test_chat_openai_streaming() do?
test_chat_openai_streaming() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It verifies that ChatOpenAI configured with streaming=True invokes the on_llm_new_token callback at least once (callback_handler.llm_streams > 0) and that invoke() still returns a BaseMessage.
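The callback path the test exercises can also be observed with a handler of your own. The sketch below is illustrative only and not part of the test file; the TokenCounter class and the model name are assumptions.

from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

class TokenCounter(BaseCallbackHandler):
    """Illustrative handler: counts chunks delivered via on_llm_new_token."""

    def __init__(self) -> None:
        self.streamed_tokens = 0

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.streamed_tokens += 1

handler = TokenCounter()
chat = ChatOpenAI(model="gpt-4o-mini", streaming=True, callbacks=[handler])  # model name assumed
chat.invoke([HumanMessage(content="Hello")])
assert handler.streamed_tokens > 0  # mirrors the llm_streams assertion in the test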
Where is test_chat_openai_streaming() defined?
test_chat_openai_streaming() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 158.
