test_chat_openai_streaming_llm_output_contains_model_name() — langchain Function Reference
Architecture documentation for the test_chat_openai_streaming_llm_output_contains_model_name() function in test_base.py from the langchain codebase.
Dependency Diagram
graph TD
    f01b83bd_b798_0dbf_abbf_58ffcf61e6f6["test_chat_openai_streaming_llm_output_contains_model_name()"]
    bd382a4e_442c_13ae_530c_6e34bc43623d["test_base.py"]
    f01b83bd_b798_0dbf_abbf_58ffcf61e6f6 -->|defined in| bd382a4e_442c_13ae_530c_6e34bc43623d
    style f01b83bd_b798_0dbf_abbf_58ffcf61e6f6 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 205–211
def test_chat_openai_streaming_llm_output_contains_model_name() -> None:
    """Test llm_output contains model_name."""
    chat = ChatOpenAI(max_tokens=MAX_TOKEN_COUNT, streaming=True)  # type: ignore[call-arg]
    message = HumanMessage(content="Hello")
    llm_result = chat.generate([[message]])
    assert llm_result.llm_output is not None
    assert llm_result.llm_output["model_name"] == chat.model_name
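The real test is an integration test: it needs a live OpenAI API key, and `MAX_TOKEN_COUNT` is a constant defined elsewhere in the test module. The assertion pattern itself can be sketched without network access. In the minimal sketch below, `FakeChat` and `FakeResult` are hypothetical stand-ins for `ChatOpenAI` and langchain's `LLMResult`, illustrating only what the two assertions check:

```python
# Dependency-free sketch of the assertion pattern in the test above.
# FakeChat and FakeResult are hypothetical stand-ins for ChatOpenAI
# and LLMResult; they are not part of langchain.
from dataclasses import dataclass
from typing import Optional


@dataclass
class FakeResult:
    # Mirrors LLMResult.llm_output: provider-level metadata for the call.
    llm_output: Optional[dict] = None


@dataclass
class FakeChat:
    model_name: str = "gpt-3.5-turbo"
    streaming: bool = True

    def generate(self, messages):
        # The behavior under test: generate() reports the model name in
        # llm_output even when streaming=True.
        return FakeResult(llm_output={"model_name": self.model_name})


chat = FakeChat(streaming=True)
result = chat.generate([["Hello"]])
assert result.llm_output is not None
assert result.llm_output["model_name"] == chat.model_name
```

The point of the original test is the streaming case: streamed responses arrive chunk by chunk, so it is easy for an implementation to drop call-level metadata such as the model name, and the test pins down that `ChatOpenAI` does not.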
Frequently Asked Questions
What does test_chat_openai_streaming_llm_output_contains_model_name() do?
test_chat_openai_streaming_llm_output_contains_model_name() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It verifies that when ChatOpenAI is configured with streaming=True, the result returned by generate() has a non-None llm_output whose "model_name" entry matches the model name configured on the client.
Where is test_chat_openai_streaming_llm_output_contains_model_name() defined?
test_chat_openai_streaming_llm_output_contains_model_name() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 205.