get_prompts() — langchain Function Reference

Architecture documentation for the get_prompts() function in llms.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  get_prompts["get_prompts()"]
  llms_py["llms.py"]
  get_prompts -->|defined in| llms_py
  generate["generate()"]
  generate -->|calls| get_prompts
  resolve_cache["_resolve_cache()"]
  get_prompts -->|calls| resolve_cache
  style get_prompts fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/core/langchain_core/language_models/llms.py lines 155–188

def get_prompts(
    params: dict[str, Any],
    prompts: list[str],
    cache: BaseCache | bool | None = None,  # noqa: FBT001
) -> tuple[dict[int, list], str, list[int], list[str]]:
    """Get prompts that are already cached.

    Args:
        params: Dictionary of parameters.
        prompts: List of prompts.
        cache: Cache object.

    Returns:
        A tuple of existing prompts, llm_string, missing prompt indexes,
            and missing prompts.

    Raises:
        ValueError: If the cache is not set and cache is True.
    """
    llm_string = str(sorted(params.items()))
    missing_prompts = []
    missing_prompt_idxs = []
    existing_prompts = {}

    llm_cache = _resolve_cache(cache=cache)
    for i, prompt in enumerate(prompts):
        if llm_cache:
            cache_val = llm_cache.lookup(prompt, llm_string)
            if isinstance(cache_val, list):
                existing_prompts[i] = cache_val
            else:
                missing_prompts.append(prompt)
                missing_prompt_idxs.append(i)
    return existing_prompts, llm_string, missing_prompt_idxs, missing_prompts
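
A short usage sketch may help illustrate the return values. The example below is an assumption based on the snippet above and the public langchain_core layout (InMemoryCache, Generation), not taken from the langchain documentation; it pre-populates a cache and shows how get_prompts() partitions prompts into cached hits and misses.

# Hypothetical usage sketch, not from the langchain test suite.
from langchain_core.caches import InMemoryCache
from langchain_core.language_models.llms import get_prompts
from langchain_core.outputs import Generation

params = {"model_name": "my-model", "temperature": 0.0}
llm_string = str(sorted(params.items()))  # same key get_prompts() derives internally

cache = InMemoryCache()
cache.update("What is 2 + 2?", llm_string, [Generation(text="4")])

prompts = ["What is 2 + 2?", "Name a prime number."]
existing, key, missing_idxs, missing = get_prompts(params, prompts, cache=cache)

# existing      -> {0: [Generation(text="4")]}
# missing_idxs  -> [1]
# missing       -> ["Name a prime number."]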

Called By

generate()

Frequently Asked Questions

What does get_prompts() do?
get_prompts() checks an LLM cache for prompts that already have generations. It builds a cache key (llm_string) from the sorted model parameters, looks up each prompt against that key, and returns the cached results keyed by prompt index, together with the llm_string and the indexes and texts of the prompts that still need to be generated.
Where is get_prompts() defined?
get_prompts() is defined in libs/core/langchain_core/language_models/llms.py at line 155.
What does get_prompts() call?
get_prompts() calls one function: _resolve_cache(), which resolves the cache argument (a BaseCache instance, True, False, or None) into the cache object to query.
What calls get_prompts()?
get_prompts() is called by one function: generate(). A sketch of how such a caller can consume the returned tuple is shown below.
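
The caller pattern referenced above can be sketched as follows. This is a hypothetical helper (generate_with_cache and run_llm are illustrative names, not langchain APIs) showing how the tuple returned by get_prompts() supports generating only the cache misses and merging results back by index.

# Hypothetical caller sketch; generate_with_cache and run_llm are not langchain APIs.
from typing import Any, Callable

from langchain_core.caches import BaseCache
from langchain_core.language_models.llms import get_prompts


def generate_with_cache(
    params: dict[str, Any],
    prompts: list[str],
    cache: BaseCache,
    run_llm: Callable[[list[str]], list[list]],  # stand-in for the real model call
) -> list[list]:
    existing, llm_string, missing_idxs, missing = get_prompts(params, prompts, cache=cache)
    if missing:
        # Only cache misses are sent to the model; fresh results are cached and
        # merged back into the result dict under their original prompt indexes.
        for idx, generations in zip(missing_idxs, run_llm(missing)):
            cache.update(prompts[idx], llm_string, generations)
            existing[idx] = generations
    return [existing[i] for i in range(len(prompts))]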
