test_async_batch_size() — langchain Function Reference
Architecture documentation for the test_async_batch_size() function in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    24cbcca0_b8bc_e95c_8a07_32c95da9b178["test_async_batch_size()"]
    0cad5588_a6f2_d365_b61b_841ca3437132["test_base.py"]
    24cbcca0_b8bc_e95c_8a07_32c95da9b178 -->|defined in| 0cad5588_a6f2_d365_b61b_841ca3437132
    style 24cbcca0_b8bc_e95c_8a07_32c95da9b178 fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
libs/core/tests/unit_tests/language_models/llms/test_base.py lines 74–97
async def test_async_batch_size() -> None:
    llm = FakeListLLM(responses=["foo"] * 3)
    with collect_runs() as cb:
        await llm.abatch(["foo", "bar", "foo"], {"callbacks": [cb]})
        assert all((r.extra or {}).get("batch_size") == 3 for r in cb.traced_runs)
        assert len(cb.traced_runs) == 3
    llm = FakeListLLM(responses=["foo"])
    with collect_runs() as cb:
        await llm.abatch(["foo"], {"callbacks": [cb]})
        assert all((r.extra or {}).get("batch_size") == 1 for r in cb.traced_runs)
        assert len(cb.traced_runs) == 1
    llm = FakeListLLM(responses=["foo"])
    with collect_runs() as cb:
        await llm.ainvoke("foo")
        assert len(cb.traced_runs) == 1
        assert (cb.traced_runs[0].extra or {}).get("batch_size") == 1
    llm = FakeListLLM(responses=["foo"])
    with collect_runs() as cb:
        async for _ in llm.astream("foo"):
            pass
        assert len(cb.traced_runs) == 1
        assert (cb.traced_runs[0].extra or {}).get("batch_size") == 1
Frequently Asked Questions
What does test_async_batch_size() do?
test_async_batch_size() is an async unit test in the langchain codebase, defined in libs/core/tests/unit_tests/language_models/llms/test_base.py. It verifies that LLM runs traced via collect_runs() record the correct "batch_size" value in their extra metadata: 3 for a three-input abatch() call, and 1 for a single-input abatch(), an ainvoke() call, and an astream() iteration.
Where is test_async_batch_size() defined?
test_async_batch_size() is defined in libs/core/tests/unit_tests/language_models/llms/test_base.py at line 74.