test_global_cache_abatch() — langchain Function Reference
Architecture documentation for the test_global_cache_abatch() function in test_cache.py from the langchain codebase.
Dependency Diagram
Diagram: test_global_cache_abatch() is defined in test_cache.py.
Source Code
libs/core/tests/unit_tests/language_models/chat_models/test_cache.py lines 190–214
async def test_global_cache_abatch() -> None:
    global_cache = InMemoryCache()
    try:
        set_llm_cache(global_cache)
        chat_model = FakeListChatModel(
            cache=True, responses=["hello", "goodbye", "meow", "woof"]
        )
        results = await chat_model.abatch(["first prompt", "second prompt"])
        assert results[0].content == "hello"
        assert results[1].content == "goodbye"

        # Now try with the same prompt
        results = await chat_model.abatch(["first prompt", "first prompt"])
        assert results[0].content == "hello"
        assert results[1].content == "hello"

        global_cache = InMemoryCache()
        set_llm_cache(global_cache)
        assert global_cache._cache == {}
        results = await chat_model.abatch(["prompt", "prompt"])
        assert results[0].content == "meow"
        assert results[1].content == "meow"
    finally:
        set_llm_cache(None)
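Note the final batch: after a fresh InMemoryCache is installed, both entries of abatch(["prompt", "prompt"]) come back as "meow", showing that the duplicate prompt is served from the cache populated by the first generation rather than consuming the model's next canned response ("woof").

The same pattern can be reproduced outside the test suite. The following is a minimal sketch, assuming langchain_core's public exports (set_llm_cache, InMemoryCache, FakeListChatModel); it illustrates the pattern the test exercises and is not part of the test file.

import asyncio

from langchain_core.caches import InMemoryCache
from langchain_core.globals import set_llm_cache
from langchain_core.language_models import FakeListChatModel

async def main() -> None:
    # Install a process-wide cache, as the test does.
    set_llm_cache(InMemoryCache())
    try:
        model = FakeListChatModel(cache=True, responses=["hello", "goodbye"])
        first = await model.abatch(["first prompt", "second prompt"])
        # A repeated prompt is served from the cache instead of
        # consuming the model's next canned response.
        repeat = await model.abatch(["first prompt"])
        assert repeat[0].content == first[0].content == "hello"
    finally:
        # Restore global state so other code is unaffected.
        set_llm_cache(None)

asyncio.run(main())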
Frequently Asked Questions
What does test_global_cache_abatch() do?
test_global_cache_abatch() is an async unit test in the langchain codebase, defined in libs/core/tests/unit_tests/language_models/chat_models/test_cache.py. It verifies that a process-wide LLM cache installed with set_llm_cache() is honored by a chat model's abatch() call: repeated prompts return the cached generation instead of consuming the model's next canned response, and swapping in a fresh InMemoryCache starts from an empty store so new generations are produced. The test restores global state by calling set_llm_cache(None) in a finally block.
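The cache consulted here implements langchain_core's BaseCache interface, keyed by a (prompt, llm_string) pair. Below is a minimal sketch of that interface as exposed by InMemoryCache; the llm_string value shown is an arbitrary placeholder, since real callers derive it from the model's configuration.

from langchain_core.caches import InMemoryCache
from langchain_core.outputs import Generation

cache = InMemoryCache()
# Entries are keyed by (prompt, llm_string) and store lists of Generations.
cache.update("first prompt", "fake-llm-config", [Generation(text="hello")])
hit = cache.lookup("first prompt", "fake-llm-config")
assert hit is not None and hit[0].text == "hello"
# A prompt that was never generated misses the cache.
assert cache.lookup("other prompt", "fake-llm-config") is None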
Where is test_global_cache_abatch() defined?
test_global_cache_abatch() is defined in libs/core/tests/unit_tests/language_models/chat_models/test_cache.py at line 190.