test_prompt_caching.py — langchain Source File

Architecture documentation for test_prompt_caching.py, a Python test file in the langchain codebase with 12 imports and 0 dependents.

File | Python | LangChainCore | Runnables | 12 imports | 7 functions | 3 classes

Entity Profile

Dependency Diagram

graph LR
  80b9bbd1_b825_9778_a36c_351fbf1d2478["test_prompt_caching.py"]
  f3365e3c_fb7a_bb9a_bc79_059b06cb7024["warnings"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> f3365e3c_fb7a_bb9a_bc79_059b06cb7024
  feec1ec4_6917_867b_d228_b134d0ff8099["typing"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> feec1ec4_6917_867b_d228_b134d0ff8099
  23cb242e_1754_041d_200a_553fcb8abe1b["unittest.mock"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> 23cb242e_1754_041d_200a_553fcb8abe1b
  f69d6389_263d_68a4_7fbf_f14c0602a9ba["pytest"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> f69d6389_263d_68a4_7fbf_f14c0602a9ba
  a681398d_ed44_c914_1a44_5d174223b069["langchain.agents.middleware.types"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> a681398d_ed44_c914_1a44_5d174223b069
  17a62cb3_fefd_6320_b757_b53bb4a1c661["langchain_core.callbacks"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> 17a62cb3_fefd_6320_b757_b53bb4a1c661
  e929cf21_6ab8_6ff3_3765_0d35a099a053["langchain_core.language_models"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> e929cf21_6ab8_6ff3_3765_0d35a099a053
  9444498b_8066_55c7_b3a2_1d90c4162a32["langchain_core.messages"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> 9444498b_8066_55c7_b3a2_1d90c4162a32
  4382dc25_6fba_324a_49e2_e9742d579385["langchain_core.outputs"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> 4382dc25_6fba_324a_49e2_e9742d579385
  e07f6d54_afcc_052d_d33f_8ccdcc46f752["langgraph.runtime"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> e07f6d54_afcc_052d_d33f_8ccdcc46f752
  6d6a73a4_c023_743d_d44b_e288cfcdbc13["langchain_anthropic.chat_models"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> 6d6a73a4_c023_743d_d44b_e288cfcdbc13
  ae698e69_19c6_a424_6f2f_077a7d3fde94["langchain_anthropic.middleware"]
  80b9bbd1_b825_9778_a36c_351fbf1d2478 --> ae698e69_19c6_a424_6f2f_077a7d3fde94
  style 80b9bbd1_b825_9778_a36c_351fbf1d2478 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Tests for Anthropic prompt caching middleware."""

import warnings
from typing import Any, cast
from unittest.mock import MagicMock

import pytest
from langchain.agents.middleware.types import ModelRequest, ModelResponse
from langchain_core.callbacks import (
    AsyncCallbackManagerForLLMRun,
    CallbackManagerForLLMRun,
)
from langchain_core.language_models import BaseChatModel
from langchain_core.messages import AIMessage, BaseMessage, HumanMessage
from langchain_core.outputs import ChatGeneration, ChatResult
from langgraph.runtime import Runtime

from langchain_anthropic.chat_models import (
    ChatAnthropic,
    _collect_code_execution_tool_ids,
    _is_code_execution_related_block,
)
from langchain_anthropic.middleware import AnthropicPromptCachingMiddleware


class FakeToolCallingModel(BaseChatModel):
    """Fake model for testing middleware."""

    def _generate(
        self,
        messages: list[BaseMessage],
        stop: list[str] | None = None,
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> ChatResult:
        """Top Level call"""
        messages_string = "-".join([str(m.content) for m in messages])
        message = AIMessage(content=messages_string, id="0")
        return ChatResult(generations=[ChatGeneration(message=message)])

    async def _agenerate(
        self,
        messages: list[BaseMessage],
        stop: list[str] | None = None,
        run_manager: AsyncCallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> ChatResult:
        """Async top level call"""
        messages_string = "-".join([str(m.content) for m in messages])
        message = AIMessage(content=messages_string, id="0")
        return ChatResult(generations=[ChatGeneration(message=message)])

    @property
    def _llm_type(self) -> str:
        return "fake-tool-call-model"


def test_anthropic_prompt_caching_middleware_initialization() -> None:
    """Test AnthropicPromptCachingMiddleware initialization."""
    # Test with custom values
# ... (482 more lines)
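
The FakeToolCallingModel above is the test double the truncated test functions build on: it echoes the conversation back by joining the string contents of the incoming messages with "-", so a test can assert exactly which messages reached the model. A minimal usage sketch (illustrative only, not part of the file):

from langchain_core.messages import HumanMessage

# The fake joins every message's content with "-", so the returned
# AIMessage exposes the exact sequence of messages the model received.
model = FakeToolCallingModel()
result = model.invoke([HumanMessage(content="hello"), HumanMessage(content="world")])
assert result.content == "hello-world"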

Domain

  • LangChainCore

Subdomains

  • Runnables

Dependencies

  • langchain.agents.middleware.types
  • langchain_anthropic.chat_models
  • langchain_anthropic.middleware
  • langchain_core.callbacks
  • langchain_core.language_models
  • langchain_core.messages
  • langchain_core.outputs
  • langgraph.runtime
  • pytest
  • typing
  • unittest.mock
  • warnings

Frequently Asked Questions

What does test_prompt_caching.py do?
test_prompt_caching.py is a Python source file in the langchain codebase containing unit tests for the Anthropic prompt caching middleware (AnthropicPromptCachingMiddleware from langchain_anthropic.middleware). It belongs to the LangChainCore domain, Runnables subdomain. A usage sketch of the middleware follows this FAQ.
What functions are defined in test_prompt_caching.py?
test_prompt_caching.py defines 7 test functions: test_anthropic_prompt_caching_middleware_async, test_anthropic_prompt_caching_middleware_async_default_values, test_anthropic_prompt_caching_middleware_async_min_messages, test_anthropic_prompt_caching_middleware_async_unsupported_model, test_anthropic_prompt_caching_middleware_async_with_system_prompt, test_anthropic_prompt_caching_middleware_initialization, and test_anthropic_prompt_caching_middleware_unsupported_model.
What does test_prompt_caching.py depend on?
test_prompt_caching.py imports 12 modules: langchain.agents.middleware.types, langchain_anthropic.chat_models, langchain_anthropic.middleware, langchain_core.callbacks, langchain_core.language_models, langchain_core.messages, langchain_core.outputs, langgraph.runtime, pytest, typing, unittest.mock, and warnings.
Where is test_prompt_caching.py in the architecture?
test_prompt_caching.py is located at libs/partners/anthropic/tests/unit_tests/middleware/test_prompt_caching.py (domain: LangChainCore, subdomain: Runnables, directory: libs/partners/anthropic/tests/unit_tests/middleware).
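
For context on what the file under test exercises: AnthropicPromptCachingMiddleware is intended to enable Anthropic prompt caching for requests made by an agent when attached to the agent's middleware chain. A minimal sketch, assuming langchain's v1 create_agent accepts a middleware list and that the middleware's default constructor is usable (the exact constructor parameters are not visible in this file's truncated source):

from langchain.agents import create_agent
from langchain_anthropic import ChatAnthropic
from langchain_anthropic.middleware import AnthropicPromptCachingMiddleware

# Sketch only: attach the caching middleware to an agent so that prompt
# caching is applied to requests sent to the Anthropic API. Constructor
# arguments (e.g. a cache TTL or minimum message count, hinted at by the
# test names above) are left at their defaults because their exact names
# are not shown in the visible portion of the file.
agent = create_agent(
    model=ChatAnthropic(model="claude-sonnet-4-5"),
    tools=[],
    middleware=[AnthropicPromptCachingMiddleware()],
)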
