test_prompt_cache_key_invoke() — langchain Function Reference
Architecture documentation for the test_prompt_cache_key_invoke() function in test_base.py from the langchain codebase.
Dependency Diagram
graph TD
    703b5de4_fffd_816e_569b_1950957d13df["test_prompt_cache_key_invoke()"]
    bd382a4e_442c_13ae_530c_6e34bc43623d["test_base.py"]
    703b5de4_fffd_816e_569b_1950957d13df -->|defined in| bd382a4e_442c_13ae_530c_6e34bc43623d
    style 703b5de4_fffd_816e_569b_1950957d13df fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 1176–1193
def test_prompt_cache_key_invoke() -> None:
"""Test that `prompt_cache_key` works with invoke calls."""
chat = ChatOpenAI(model="gpt-5-nano", max_completion_tokens=500)
messages = [HumanMessage("Say hello")]
# Test that invoke works with prompt_cache_key parameter
response = chat.invoke(messages, prompt_cache_key="integration-test-v1")
assert isinstance(response, AIMessage)
assert isinstance(response.content, str)
assert len(response.content) > 0
# Test that subsequent call with same cache key also works
response2 = chat.invoke(messages, prompt_cache_key="integration-test-v1")
assert isinstance(response2, AIMessage)
assert isinstance(response2.content, str)
assert len(response2.content) > 0
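The test above exercises `prompt_cache_key` end-to-end against the live OpenAI API. As a rough sketch of the mechanism under test (note: `FakeChatModel` below is a hypothetical stand-in, not the actual ChatOpenAI implementation), per-call keyword arguments such as `prompt_cache_key` are forwarded into the outgoing request payload alongside the model and messages:

```python
# Hypothetical sketch: how a per-call parameter like prompt_cache_key can be
# forwarded into a request payload. This mimics the shape of the test above
# but does not reflect langchain's actual internals.
class FakeChatModel:
    def __init__(self, model: str) -> None:
        self.model = model

    def invoke(self, messages: list[dict], **kwargs) -> dict:
        # Merge any per-call kwargs (e.g. prompt_cache_key) into the payload
        # that would be sent to the chat-completions endpoint.
        return {"model": self.model, "messages": messages, **kwargs}


chat = FakeChatModel("gpt-5-nano")
messages = [{"role": "user", "content": "Say hello"}]
resp = chat.invoke(messages, prompt_cache_key="integration-test-v1")
print(resp["prompt_cache_key"])  # the cache key survives into the payload
```

Repeating the call with the same `prompt_cache_key`, as the test does with `response2`, sends an identical cache key so the provider can route the request to the same prompt cache.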
Frequently Asked Questions
What does test_prompt_cache_key_invoke() do?
test_prompt_cache_key_invoke() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It verifies that the `prompt_cache_key` parameter can be passed to ChatOpenAI's invoke() call, and that both an initial call and a subsequent call with the same cache key return non-empty AIMessage responses.
Where is test_prompt_cache_key_invoke() defined?
test_prompt_cache_key_invoke() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 1176.