test_chat_openai_streaming_generation_info() — langchain Function Reference

Architecture documentation for the test_chat_openai_streaming_generation_info() function, an Azure OpenAI chat-model integration test defined in test_azure.py in the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  9d924a95_2652_a546_2242_dd200471a820["test_chat_openai_streaming_generation_info()"]
  c413d48d_e43d_eae6_47cb_3eea9394c77c["test_azure.py"]
  9d924a95_2652_a546_2242_dd200471a820 -->|defined in| c413d48d_e43d_eae6_47cb_3eea9394c77c
  ecd4f9fe_54c4_9db4_d6bc_28d6a1ef2f07["on_llm_end()"]
  9d924a95_2652_a546_2242_dd200471a820 -->|calls| ecd4f9fe_54c4_9db4_d6bc_28d6a1ef2f07
  ccf90d51_5ff2_7b50_5922_b28b80cca584["_get_llm()"]
  9d924a95_2652_a546_2242_dd200471a820 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
  style 9d924a95_2652_a546_2242_dd200471a820 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/tests/integration_tests/chat_models/test_azure.py lines 104–120

def test_chat_openai_streaming_generation_info() -> None:
    """Test that generation info is preserved when streaming."""

    class _FakeCallback(FakeCallbackHandler):
        saved_things: dict = {}

        def on_llm_end(self, *args: Any, **kwargs: Any) -> Any:
            # Save the generation
            self.saved_things["generation"] = args[0]

    callback = _FakeCallback()
    callback_manager = CallbackManager([callback])
    chat = _get_llm(max_tokens=2, temperature=0, callbacks=callback_manager)
    list(chat.stream("hi"))
    generation = callback.saved_things["generation"]
    # `Hello!` is two tokens, assert that is what is returned
    assert generation.generations[0][0].text == "Hello!"
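
The test delegates model construction to _get_llm(), which is not shown on this page. The sketch below is a hypothetical reconstruction, assuming the helper simply wraps AzureChatOpenAI from langchain_openai, reads Azure settings from the environment, and forwards keyword overrides such as max_tokens, temperature, and callbacks; the parameter and environment-variable names here are assumptions, not verbatim source.

import os
from typing import Any

from langchain_openai import AzureChatOpenAI


def _get_llm(**kwargs: Any) -> AzureChatOpenAI:
    # Hypothetical helper: builds the Azure chat model under test from
    # environment-driven settings and forwards test-specific overrides.
    return AzureChatOpenAI(
        azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"],  # assumed env var
        api_version=os.environ["OPENAI_API_VERSION"],  # assumed env var
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # assumed env var
        **kwargs,
    )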

Frequently Asked Questions

What does test_chat_openai_streaming_generation_info() do?
test_chat_openai_streaming_generation_info() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py. It streams a short completion from an Azure OpenAI chat model built by _get_llm() and uses a fake callback handler to assert that the generation info captured in on_llm_end() preserves the streamed text.
Where is test_chat_openai_streaming_generation_info() defined?
test_chat_openai_streaming_generation_info() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py at line 104.
What does test_chat_openai_streaming_generation_info() call?
test_chat_openai_streaming_generation_info() calls two functions: _get_llm() and on_llm_end().
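
on_llm_end() is not invoked directly from the test body: the test drains the stream with list(chat.stream("hi")), and the chat model's callback machinery then calls on_llm_end() on the registered handler with the final result. The snippet below is an illustrative sketch, assuming the object passed as args[0] is a langchain_core LLMResult wrapping a ChatGeneration; it only shows why generation.generations[0][0].text can be compared against the streamed output.

from langchain_core.messages import AIMessage
from langchain_core.outputs import ChatGeneration, LLMResult

# Assumed shape of the object handed to on_llm_end(): one prompt with one
# generation whose text is the concatenation of the streamed tokens.
result = LLMResult(generations=[[ChatGeneration(message=AIMessage(content="Hello!"))]])
assert result.generations[0][0].text == "Hello!"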
