test_glm4_stream() — langchain Function Reference
Architecture documentation for the test_glm4_stream() function in test_base.py from the langchain codebase.
Entity Profile
test_glm4_stream() is a Python unit-test function defined at lines 421–443 of libs/partners/openai/tests/unit_tests/chat_models/test_base.py in the langchain repository's langchain-openai partner package.
Dependency Diagram
graph TD
    705f8a06_8729_723d_8671_5fccaa4c59f5["test_glm4_stream()"]
    48232d20_f8c1_b597_14fa_7dc407e9bfe5["test_base.py"]
    705f8a06_8729_723d_8671_5fccaa4c59f5 -->|defined in| 48232d20_f8c1_b597_14fa_7dc407e9bfe5
    style 705f8a06_8729_723d_8671_5fccaa4c59f5 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/unit_tests/chat_models/test_base.py lines 421–443
def test_glm4_stream(mock_glm4_completion: list) -> None:
    llm_name = "glm-4"
    llm = ChatOpenAI(model=llm_name, stream_usage=True)
    mock_client = MagicMock()

    def mock_create(*args: Any, **kwargs: Any) -> MockSyncContextManager:
        return MockSyncContextManager(mock_glm4_completion)

    mock_client.create = mock_create
    usage_chunk = mock_glm4_completion[-1]
    usage_metadata: UsageMetadata | None = None
    with patch.object(llm, "client", mock_client):
        for chunk in llm.stream("你的名字叫什么?只回答名字"):
            assert isinstance(chunk, AIMessageChunk)
            if chunk.usage_metadata is not None:
                usage_metadata = chunk.usage_metadata

    assert usage_metadata is not None
    assert usage_metadata["input_tokens"] == usage_chunk["usage"]["prompt_tokens"]
    assert usage_metadata["output_tokens"] == usage_chunk["usage"]["completion_tokens"]
    assert usage_metadata["total_tokens"] == usage_chunk["usage"]["total_tokens"]
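The test depends on two pieces of scaffolding that are defined elsewhere in the test suite and not shown above: the mock_glm4_completion fixture and the MockSyncContextManager helper. The sketch below illustrates how they plausibly fit together; it is not the actual implementation from test_base.py, and every payload field other than the usage totals asserted on above is an assumption.

# Illustrative sketch only -- the real fixture and helper live in the langchain
# test suite; chunk fields beyond "usage" are assumed, not taken from the source.
from collections.abc import Iterator
from typing import Any

import pytest


class MockSyncContextManager:
    # Stands in for a streaming response object: usable as a context manager
    # and iterable over pre-canned completion chunks.
    def __init__(self, chunks: list) -> None:
        self.chunks = chunks

    def __enter__(self) -> "MockSyncContextManager":
        return self

    def __exit__(self, *args: Any) -> None:
        pass

    def __iter__(self) -> Iterator[dict]:
        return iter(self.chunks)


@pytest.fixture
def mock_glm4_completion() -> list:
    # Hypothetical GLM-4 streaming payloads in OpenAI chat-completion-chunk
    # shape; the final chunk carries the usage totals the assertions read.
    return [
        {
            "id": "chatcmpl-1",
            "object": "chat.completion.chunk",
            "model": "glm-4",
            "choices": [
                {
                    "index": 0,
                    "delta": {"role": "assistant", "content": "GLM"},
                    "finish_reason": None,
                }
            ],
        },
        {
            "id": "chatcmpl-1",
            "object": "chat.completion.chunk",
            "model": "glm-4",
            "choices": [{"index": 0, "delta": {}, "finish_reason": "stop"}],
            "usage": {"prompt_tokens": 13, "completion_tokens": 2, "total_tokens": 15},
        },
    ]

With scaffolding of this general shape, patch.object(llm, "client", mock_client) routes llm.stream() to the canned chunks, so the test exercises the streaming code path without any network access or API key.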
Frequently Asked Questions
What does test_glm4_stream() do?
test_glm4_stream() is a unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py. It verifies that ChatOpenAI reports token-usage metadata when streaming from a GLM-4 model: it patches the chat model's client with a MagicMock whose create method returns a MockSyncContextManager over pre-canned GLM-4 completion chunks, streams the prompt "你的名字叫什么?只回答名字" ("What is your name? Answer with the name only."), asserts every yielded chunk is an AIMessageChunk, and then checks that the captured usage_metadata matches the prompt_tokens, completion_tokens, and total_tokens reported in the final usage chunk. A hedged sketch of the stream_usage feature the test exercises follows below.
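The behaviour under test can be exercised directly, without mocks, roughly as follows. This is a hedged sketch, not part of the test: it assumes credentials and an OpenAI-compatible endpoint that serves glm-4 (the base_url below is a placeholder), whereas the unit test avoids both by patching the client.

# Hedged usage sketch: placeholder endpoint and credential, not real values.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="glm-4",
    stream_usage=True,  # request token usage in the streamed chunks
    base_url="https://example.invalid/v1",  # placeholder OpenAI-compatible endpoint
    api_key="...",  # placeholder credential
)

for chunk in llm.stream("What is your name? Answer with the name only."):
    # Most chunks carry text deltas; the usage-bearing chunk exposes the counts.
    if chunk.usage_metadata is not None:
        print(chunk.usage_metadata)
        # e.g. {'input_tokens': ..., 'output_tokens': ..., 'total_tokens': ...}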
Where is test_glm4_stream() defined?
test_glm4_stream() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py at line 421.