test_prompt_cache_key_responses_api() — langchain Function Reference
Architecture documentation for the test_prompt_cache_key_responses_api() function in test_prompt_cache_key.py from the langchain codebase.
Dependency Diagram
graph TD
    9a9891bd_20df_18a7_75c2_67460c0ee9b5["test_prompt_cache_key_responses_api()"]
    adfe4542_d9fc_113e_aad7_23a86789157a["test_prompt_cache_key.py"]
    9a9891bd_20df_18a7_75c2_67460c0ee9b5 -->|defined in| adfe4542_d9fc_113e_aad7_23a86789157a
    style 9a9891bd_20df_18a7_75c2_67460c0ee9b5 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/unit_tests/chat_models/test_prompt_cache_key.py lines 71–87
def test_prompt_cache_key_responses_api() -> None:
    """Test that prompt_cache_key works with Responses API."""
    chat = ChatOpenAI(
        model="gpt-4o-mini",
        use_responses_api=True,
        output_version="responses/v1",
        max_completion_tokens=10,
    )
    messages = [HumanMessage("Hello")]
    payload = chat._get_request_payload(
        messages, prompt_cache_key="responses-api-cache-v1"
    )
    # prompt_cache_key should be present regardless of API type
    assert "prompt_cache_key" in payload
    assert payload["prompt_cache_key"] == "responses-api-cache-v1"
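The behavior under test can be sketched without the langchain dependency: a request builder that forwards extra keyword arguments such as prompt_cache_key into the outgoing payload unchanged, whichever API shape is selected. The build_request_payload helper below is a hypothetical illustration of that pattern, not the actual ChatOpenAI._get_request_payload implementation.

```python
from typing import Any


def build_request_payload(
    messages: list[str],
    model: str,
    use_responses_api: bool = False,
    **extra: Any,
) -> dict[str, Any]:
    """Illustrative sketch: extra kwargs (e.g. prompt_cache_key) are
    forwarded verbatim regardless of which API shape is in use."""
    if use_responses_api:
        # Responses API uses an "input" field for the conversation
        payload: dict[str, Any] = {
            "model": model,
            "input": [{"role": "user", "content": m} for m in messages],
        }
    else:
        # Chat Completions API uses a "messages" field
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": m} for m in messages],
        }
    payload.update(extra)  # prompt_cache_key lands here in both branches
    return payload


payload = build_request_payload(
    ["Hello"],
    "gpt-4o-mini",
    use_responses_api=True,
    prompt_cache_key="responses-api-cache-v1",
)
assert payload["prompt_cache_key"] == "responses-api-cache-v1"
```

Because the cache key is merged in after the API-specific fields are built, the assertion holds for both payload shapes, which is exactly what the test above checks for the Responses API branch.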
Frequently Asked Questions
What does test_prompt_cache_key_responses_api() do?
test_prompt_cache_key_responses_api() is a unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_prompt_cache_key.py. It verifies that a prompt_cache_key passed to ChatOpenAI is included in the request payload when the Responses API is enabled (use_responses_api=True).
Where is test_prompt_cache_key_responses_api() defined?
test_prompt_cache_key_responses_api() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_prompt_cache_key.py at line 71.