update_cache() — langchain Function Reference

Architecture documentation for the update_cache() function in llms.py from the langchain codebase.

Dependency Diagram

graph TD
  37c0466f_ed37_9b74_0dad_7d106ca16ce8["update_cache()"]
  a4692bf1_369d_4673_b1eb_6b9a8cbb9994["llms.py"]
  37c0466f_ed37_9b74_0dad_7d106ca16ce8 -->|defined in| a4692bf1_369d_4673_b1eb_6b9a8cbb9994
  3ab24a73_7222_b6b9_1708_d227f0b8a684["generate()"]
  3ab24a73_7222_b6b9_1708_d227f0b8a684 -->|calls| 37c0466f_ed37_9b74_0dad_7d106ca16ce8
  850d0849_18b0_9d36_a562_50a644c67a24["_resolve_cache()"]
  37c0466f_ed37_9b74_0dad_7d106ca16ce8 -->|calls| 850d0849_18b0_9d36_a562_50a644c67a24
  style 37c0466f_ed37_9b74_0dad_7d106ca16ce8 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/core/langchain_core/language_models/llms.py lines 226–256

def update_cache(
    cache: BaseCache | bool | None,  # noqa: FBT001
    existing_prompts: dict[int, list],
    llm_string: str,
    missing_prompt_idxs: list[int],
    new_results: LLMResult,
    prompts: list[str],
) -> dict | None:
    """Update the cache and get the LLM output.

    Args:
        cache: Cache object.
        existing_prompts: Dictionary of existing prompts.
        llm_string: LLM string.
        missing_prompt_idxs: List of missing prompt indexes.
        new_results: LLMResult object.
        prompts: List of prompts.

    Returns:
        LLM output.

    Raises:
        ValueError: If the cache is not set and cache is True.
    """
    llm_cache = _resolve_cache(cache=cache)
    for i, result in enumerate(new_results.generations):
        existing_prompts[missing_prompt_idxs[i]] = result
        prompt = prompts[missing_prompt_idxs[i]]
        if llm_cache is not None:
            llm_cache.update(prompt, llm_string, result)
    return new_results.llm_output
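
The loop above can be exercised in isolation. The sketch below reproduces the same update logic with stand-in types (`DictCache` and `FakeResult` are illustrative placeholders, not langchain's actual `BaseCache`/`LLMResult` classes): each newly generated result is written back into `existing_prompts` at its original index and into the cache under the `(prompt, llm_string)` key, and the provider metadata is returned.

```python
class DictCache:
    """Toy cache keyed by (prompt, llm_string), mirroring BaseCache.update()."""
    def __init__(self):
        self.store = {}

    def update(self, prompt, llm_string, return_val):
        self.store[(prompt, llm_string)] = return_val


class FakeResult:
    """Stand-in for LLMResult: per-prompt generations plus provider metadata."""
    def __init__(self, generations, llm_output):
        self.generations = generations
        self.llm_output = llm_output


def update_cache_sketch(llm_cache, existing_prompts, llm_string,
                        missing_prompt_idxs, new_results, prompts):
    # Same loop as update_cache(): write each new generation back into
    # existing_prompts and into the cache, then return provider metadata.
    for i, result in enumerate(new_results.generations):
        existing_prompts[missing_prompt_idxs[i]] = result
        prompt = prompts[missing_prompt_idxs[i]]
        if llm_cache is not None:
            llm_cache.update(prompt, llm_string, result)
    return new_results.llm_output


cache = DictCache()
prompts = ["a", "b", "c"]
existing = {0: ["cached-answer-a"]}  # prompt 0 was a cache hit
new = FakeResult([["gen-b"], ["gen-c"]], {"token_usage": {}})
out = update_cache_sketch(cache, existing, "fake-llm", [1, 2], new, prompts)
```

After the call, `existing` holds results for all three prompts in their original order, and the cache contains entries for the two misses only.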

Frequently Asked Questions

What does update_cache() do?
update_cache() writes newly generated LLM results into the cache and into the existing_prompts mapping (keyed by each prompt's original index), then returns the provider metadata from new_results.llm_output. It is defined in libs/core/langchain_core/language_models/llms.py.
Where is update_cache() defined?
update_cache() is defined in libs/core/langchain_core/language_models/llms.py at line 226.
What does update_cache() call?
update_cache() calls 1 function(s): _resolve_cache.
What calls update_cache()?
update_cache() is called by 1 function(s): generate.
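
To see where the `missing_prompt_idxs` argument comes from, here is a hypothetical caller-side sketch of how a generate()-style function might split prompts into cache hits and misses before calling update_cache(). The helper name `split_hits` and the `ToyCache` class are assumptions for illustration, not langchain's actual implementation; only the hit/miss bookkeeping pattern is taken from the documented signature.

```python
def split_hits(llm_cache, prompts, llm_string):
    """Partition prompts into cached results and indices still to generate."""
    existing_prompts, missing_prompt_idxs, missing_prompts = {}, [], []
    for i, prompt in enumerate(prompts):
        # A None cache means caching is disabled: everything is a miss.
        hit = llm_cache.lookup(prompt, llm_string) if llm_cache else None
        if hit is not None:
            existing_prompts[i] = hit
        else:
            missing_prompt_idxs.append(i)
            missing_prompts.append(prompt)
    return existing_prompts, missing_prompt_idxs, missing_prompts


class ToyCache:
    """Illustrative read-only cache with a BaseCache-style lookup()."""
    def __init__(self, store):
        self.store = store

    def lookup(self, prompt, llm_string):
        return self.store.get((prompt, llm_string))


cache = ToyCache({("a", "fake-llm"): ["cached-a"]})
existing, missing_idxs, missing = split_hits(cache, ["a", "b"], "fake-llm")
```

The three return values map directly onto the `existing_prompts`, `missing_prompt_idxs`, and (via the LLM call) `new_results`/`prompts` parameters that update_cache() expects.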
