test_prompt_cache_key_parameter_inclusion() — langchain Function Reference

Architecture documentation for the test_prompt_cache_key_parameter_inclusion() function in test_prompt_cache_key.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  fn["test_prompt_cache_key_parameter_inclusion()"]
  file["test_prompt_cache_key.py"]
  fn -->|defined in| file
  style fn fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/tests/unit_tests/chat_models/test_prompt_cache_key.py lines 8–15

def test_prompt_cache_key_parameter_inclusion() -> None:
    """Test that prompt_cache_key parameter is properly included in request payload."""
    chat = ChatOpenAI(model="gpt-4o-mini", max_completion_tokens=10)
    messages = [HumanMessage("Hello")]

    payload = chat._get_request_payload(messages, prompt_cache_key="test-cache-key")
    assert "prompt_cache_key" in payload
    assert payload["prompt_cache_key"] == "test-cache-key"
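The excerpt starts at line 8, so the file's module-level imports are not shown. To run the test standalone you would need at least the following (a sketch assuming the standard langchain-openai and langchain-core import paths; the real file may import more):

# Imports omitted from the lines 8–15 excerpt above
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

Because _get_request_payload builds the request dict locally rather than calling the OpenAI API, the test runs offline; the unit-test environment typically supplies a dummy OPENAI_API_KEY so that ChatOpenAI can be constructed.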

Frequently Asked Questions

What does test_prompt_cache_key_parameter_inclusion() do?
test_prompt_cache_key_parameter_inclusion() is a unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_prompt_cache_key.py. It verifies that a prompt_cache_key keyword argument passed to ChatOpenAI is included, unchanged, in the request payload built for the OpenAI API.
Where is test_prompt_cache_key_parameter_inclusion() defined?
test_prompt_cache_key_parameter_inclusion() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_prompt_cache_key.py at line 8.
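How would the tested parameter be used in practice?
A minimal sketch, assuming per-invocation keyword arguments reach the request payload the same way the test's direct _get_request_payload call does (the "my-session-cache" value is hypothetical):

from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

chat = ChatOpenAI(model="gpt-4o-mini")
# The per-call kwarg should land in the outgoing request payload as
# "prompt_cache_key", which is exactly what the test above asserts.
response = chat.invoke(
    [HumanMessage("Hello")],
    prompt_cache_key="my-session-cache",
)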
