test_batch_size() — langchain Function Reference
Architecture documentation for the test_batch_size() function in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
  337d7973_60b3_88f5_cab6_c04b4a499b37["test_batch_size()"]
  8cb88ac4_61d9_baf3_9df4_9b3f5095927e["test_base.py"]
  337d7973_60b3_88f5_cab6_c04b4a499b37 -->|defined in| 8cb88ac4_61d9_baf3_9df4_9b3f5095927e
  style 337d7973_60b3_88f5_cab6_c04b4a499b37 fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
libs/core/tests/unit_tests/language_models/chat_models/test_base.py lines 97–118
def test_batch_size(messages: list[BaseMessage], messages_2: list[BaseMessage]) -> None:
    # The base endpoint doesn't support native batching,
    # so we expect batch_size to always be 1
    llm = FakeListChatModel(responses=[str(i) for i in range(100)])
    with collect_runs() as cb:
        llm.batch([messages, messages_2], {"callbacks": [cb]})
        assert len(cb.traced_runs) == 2
        assert all((r.extra or {}).get("batch_size") == 1 for r in cb.traced_runs)
    with collect_runs() as cb:
        llm.batch([messages], {"callbacks": [cb]})
        assert all((r.extra or {}).get("batch_size") == 1 for r in cb.traced_runs)
        assert len(cb.traced_runs) == 1
    with collect_runs() as cb:
        llm.invoke(messages)
        assert len(cb.traced_runs) == 1
        assert (cb.traced_runs[0].extra or {}).get("batch_size") == 1
    with collect_runs() as cb:
        list(llm.stream(messages))
        assert len(cb.traced_runs) == 1
        assert (cb.traced_runs[0].extra or {}).get("batch_size") == 1
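The invariant under test follows from how batching degrades when a model has no native batch endpoint: each input is dispatched as its own run, so every run's metadata records a batch size of 1. A minimal sketch of that fallback pattern in plain Python (illustrative only, not the langchain implementation; `fallback_batch` and `traced_invoke` are hypothetical names):

```python
def fallback_batch(invoke, inputs):
    """Batch by mapping invoke over inputs, one run per input.

    Models without a native batch endpoint fall back to this pattern,
    which is why every traced run reports batch_size == 1 regardless of
    how many inputs were passed to batch().
    """
    runs = []

    def traced_invoke(x):
        # Each input produces its own run record with batch_size 1.
        runs.append({"batch_size": 1})
        return invoke(x)

    outputs = [traced_invoke(x) for x in inputs]
    return outputs, runs


outputs, runs = fallback_batch(str.upper, ["a", "b"])
# Two inputs yield two runs, each tagged batch_size == 1.
```

The test above checks the same property through langchain's `collect_runs()` tracer instead of a hand-rolled run list.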
Frequently Asked Questions
What does test_batch_size() do?
test_batch_size() is a unit test in the langchain codebase, defined in libs/core/tests/unit_tests/language_models/chat_models/test_base.py. It verifies that every run traced via collect_runs() records a batch_size of 1 — whether the model is called through batch(), invoke(), or stream() — because the base chat model endpoint doesn't support native batching.
Where is test_batch_size() defined?
test_batch_size() is defined in libs/core/tests/unit_tests/language_models/chat_models/test_base.py at line 97.