
test_cache_key_ignores_message_id_sync() — langchain Function Reference

Architecture documentation for the test_cache_key_ignores_message_id_sync() function in test_cache.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  680379ac_a652_03f2_3c3c_ce95a4b65ee6["test_cache_key_ignores_message_id_sync()"]
  51f634bf_713d_3f19_d694_5c6ef3e59c57["test_cache.py"]
  680379ac_a652_03f2_3c3c_ce95a4b65ee6 -->|defined in| 51f634bf_713d_3f19_d694_5c6ef3e59c57
  style 680379ac_a652_03f2_3c3c_ce95a4b65ee6 fill:#6366f1,stroke:#818cf8,color:#fff


Source Code

libs/core/tests/unit_tests/language_models/chat_models/test_cache.py lines 480–506

def test_cache_key_ignores_message_id_sync() -> None:
    """Test that message IDs are stripped from cache keys (sync).

    Functionally identical messages with different IDs should produce
    the same cache key and result in cache hits.
    """
    local_cache = InMemoryCache()
    model = FakeListChatModel(cache=local_cache, responses=["hello", "goodbye"])

    # First call with a message that has an ID
    msg_with_id_1 = HumanMessage(content="How are you?", id="unique-id-1")
    result_1 = model.invoke([msg_with_id_1])
    assert result_1.content == "hello"

    # Second call with the same content but different ID should hit cache
    msg_with_id_2 = HumanMessage(content="How are you?", id="unique-id-2")
    result_2 = model.invoke([msg_with_id_2])
    # Should get cached response, not "goodbye"
    assert result_2.content == "hello"

    # Third call with no ID should also hit cache
    msg_no_id = HumanMessage(content="How are you?")
    result_3 = model.invoke([msg_no_id])
    assert result_3.content == "hello"

    # Verify only one cache entry exists
    assert len(local_cache._cache) == 1
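
The property this test asserts can be illustrated with a minimal, self-contained sketch. The strip_message_ids and cache_key helpers below are hypothetical and only approximate the idea that volatile message id fields are excluded before a cache key is computed; they are not langchain's actual key-generation code.

import hashlib
import json

# Hypothetical sketch: normalize messages by dropping volatile fields such as
# "id" before serializing them into a cache key. This mirrors the behavior the
# test asserts, not langchain's internal implementation.
def strip_message_ids(messages: list[dict]) -> list[dict]:
    return [{k: v for k, v in m.items() if k != "id"} for m in messages]

def cache_key(messages: list[dict]) -> str:
    normalized = strip_message_ids(messages)
    return hashlib.sha256(json.dumps(normalized, sort_keys=True).encode()).hexdigest()

# Two functionally identical messages with different IDs map to the same key,
# so the second lookup is a cache hit.
key_1 = cache_key([{"type": "human", "content": "How are you?", "id": "unique-id-1"}])
key_2 = cache_key([{"type": "human", "content": "How are you?", "id": "unique-id-2"}])
assert key_1 == key_2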


Frequently Asked Questions

What does test_cache_key_ignores_message_id_sync() do?
test_cache_key_ignores_message_id_sync() is a unit test in the langchain codebase, defined in libs/core/tests/unit_tests/language_models/chat_models/test_cache.py. It verifies that message IDs are stripped when chat-model cache keys are computed, so functionally identical messages with different IDs (or no ID at all) produce the same cache key, hit the cache, and result in a single cache entry.
Where is test_cache_key_ignores_message_id_sync() defined?
test_cache_key_ignores_message_id_sync() is defined in libs/core/tests/unit_tests/language_models/chat_models/test_cache.py at line 480.
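
To run this test locally, assuming a checkout of the langchain monorepo with the libs/core test dependencies installed, standard pytest node-ID syntax applies:

pytest libs/core/tests/unit_tests/language_models/chat_models/test_cache.py::test_cache_key_ignores_message_id_sync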
