set_llm_cache() — langchain Function Reference
Architecture documentation for the set_llm_cache() function in globals.py from the langchain codebase.
Dependency Diagram
graph TD
    30582445_ace8_ac6c_1530_15fdcd4f87a3["set_llm_cache()"]
    bbdcb3a7_3999_113d_5595_dc3cb513799c["globals.py"]
    30582445_ace8_ac6c_1530_15fdcd4f87a3 -->|defined in| bbdcb3a7_3999_113d_5595_dc3cb513799c
    style 30582445_ace8_ac6c_1530_15fdcd4f87a3 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/globals.py lines 56–63
def set_llm_cache(value: Optional["BaseCache"]) -> None:
    """Set a new LLM cache, overwriting the previous value, if any.

    Args:
        value: The new LLM cache to use. If `None`, the LLM cache is disabled.
    """
    global _llm_cache  # noqa: PLW0603
    _llm_cache = value
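The function is a thin setter over a module-level global. A minimal, self-contained sketch of the same pattern is below; the `BaseCache` stand-in, the `InMemoryCache` class, and the `get_llm_cache()` accessor are illustrative stand-ins for this sketch, not langchain's actual implementations:

```python
from typing import Optional


class BaseCache:
    """Illustrative stand-in for a cache interface (not langchain's BaseCache)."""


class InMemoryCache(BaseCache):
    """Toy cache keyed by (prompt, llm_string), mirroring the common pattern."""

    def __init__(self) -> None:
        self._store: dict[tuple[str, str], str] = {}

    def lookup(self, prompt: str, llm_string: str) -> Optional[str]:
        return self._store.get((prompt, llm_string))

    def update(self, prompt: str, llm_string: str, value: str) -> None:
        self._store[(prompt, llm_string)] = value


# Module-level global holding the currently active cache, if any.
_llm_cache: Optional[BaseCache] = None


def set_llm_cache(value: Optional[BaseCache]) -> None:
    """Set a new LLM cache, overwriting the previous value, if any."""
    global _llm_cache
    _llm_cache = value


def get_llm_cache() -> Optional[BaseCache]:
    """Return the currently configured cache, or None if caching is disabled."""
    return _llm_cache


# Enable caching, use the cache, then disable caching again.
set_llm_cache(InMemoryCache())
cache = get_llm_cache()
assert cache is not None
cache.update("hello", "fake-llm", "world")
print(cache.lookup("hello", "fake-llm"))  # world
set_llm_cache(None)
print(get_llm_cache())  # None
```

Because the cache is a single process-wide global, calling `set_llm_cache()` affects every caller in the process; passing `None` is the documented way to turn caching off.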
Frequently Asked Questions
What does set_llm_cache() do?
set_llm_cache() sets the process-wide LLM cache used by langchain, overwriting any previously configured cache. Passing None disables LLM caching. It is defined in libs/core/langchain_core/globals.py.
Where is set_llm_cache() defined?
set_llm_cache() is defined in libs/core/langchain_core/globals.py at line 56.