_create_usage_metadata() — langchain Function Reference

Architecture documentation for the _create_usage_metadata() function in chat_models.py from the langchain codebase (langchain_anthropic partner package).

Entity Profile

Dependency Diagram

graph TD
  e8fbda3b_7aa9_1575_74a6_35146039904f["_create_usage_metadata()"]
  a85819c7_917d_4c71_2864_a19e68947340["chat_models.py"]
  e8fbda3b_7aa9_1575_74a6_35146039904f -->|defined in| a85819c7_917d_4c71_2864_a19e68947340
  e2be9f1d_bbea_f0b0_96d6_30115dc6ec54["_format_output()"]
  e2be9f1d_bbea_f0b0_96d6_30115dc6ec54 -->|calls| e8fbda3b_7aa9_1575_74a6_35146039904f
  01106d9b_3ac9_a2a5_056c_a55bc89d961b["_make_message_chunk_from_anthropic_event()"]
  01106d9b_3ac9_a2a5_056c_a55bc89d961b -->|calls| e8fbda3b_7aa9_1575_74a6_35146039904f
  style e8fbda3b_7aa9_1575_74a6_35146039904f fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/anthropic/langchain_anthropic/chat_models.py lines 2105–2145

def _create_usage_metadata(anthropic_usage: BaseModel) -> UsageMetadata:
    """Create LangChain `UsageMetadata` from Anthropic `Usage` data.

    Note:
        Anthropic's `input_tokens` excludes cached tokens, so we manually add
        `cache_read` and `cache_creation` tokens to get the true total.
    """
    input_token_details: dict = {
        "cache_read": getattr(anthropic_usage, "cache_read_input_tokens", None),
        "cache_creation": getattr(anthropic_usage, "cache_creation_input_tokens", None),
    }

    # Add cache TTL information if provided (5-minute and 1-hour ephemeral cache)
    cache_creation = getattr(anthropic_usage, "cache_creation", None)

    # Currently just copying over the 5m and 1h keys, but if more are added in the
    # future we'll need to expand this tuple
    cache_creation_keys = ("ephemeral_5m_input_tokens", "ephemeral_1h_input_tokens")
    if cache_creation:
        if isinstance(cache_creation, BaseModel):
            cache_creation = cache_creation.model_dump()
        for k in cache_creation_keys:
            input_token_details[k] = cache_creation.get(k)

    # Calculate total input tokens: Anthropic's `input_tokens` excludes cached tokens,
    # so we need to add them back to get the true total input token count
    input_tokens = (
        (getattr(anthropic_usage, "input_tokens", 0) or 0)  # Base input tokens
        + (input_token_details["cache_read"] or 0)  # Tokens read from cache
        + (input_token_details["cache_creation"] or 0)  # Tokens used to create cache
    )
    output_tokens = getattr(anthropic_usage, "output_tokens", 0) or 0

    return UsageMetadata(
        input_tokens=input_tokens,
        output_tokens=output_tokens,
        total_tokens=input_tokens + output_tokens,
        input_token_details=InputTokenDetails(
            **{k: v for k, v in input_token_details.items() if v is not None},
        ),
    )
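The token arithmetic above can be exercised in isolation. The sketch below is a hypothetical, standalone re-implementation of the same accounting logic, using a SimpleNamespace stand-in for Anthropic's Usage model and plain dicts in place of UsageMetadata and InputTokenDetails; the function and variable names here are illustrative, not the library's API.

```python
from types import SimpleNamespace

def create_usage_metadata(usage) -> dict:
    """Sketch of the cache-aware token accounting shown above."""
    # Anthropic's `input_tokens` excludes cached tokens, so cache reads and
    # cache writes are added back to get the true total.
    details = {
        "cache_read": getattr(usage, "cache_read_input_tokens", None),
        "cache_creation": getattr(usage, "cache_creation_input_tokens", None),
    }

    # Copy over per-TTL cache-creation counts if present (5-minute and
    # 1-hour ephemeral caches), mirroring the `cache_creation` handling.
    cache_creation = getattr(usage, "cache_creation", None)
    if cache_creation:
        if not isinstance(cache_creation, dict):
            cache_creation = vars(cache_creation)  # stand-in for model_dump()
        for k in ("ephemeral_5m_input_tokens", "ephemeral_1h_input_tokens"):
            details[k] = cache_creation.get(k)

    input_tokens = (
        (getattr(usage, "input_tokens", 0) or 0)   # base (uncached) input
        + (details["cache_read"] or 0)             # tokens read from cache
        + (details["cache_creation"] or 0)         # tokens written to cache
    )
    output_tokens = getattr(usage, "output_tokens", 0) or 0

    return {
        "input_tokens": input_tokens,
        "output_tokens": output_tokens,
        "total_tokens": input_tokens + output_tokens,
        "input_token_details": {
            k: v for k, v in details.items() if v is not None
        },
    }

# Example: 100 uncached input tokens, 300 read from cache, 50 written to a
# 5-minute ephemeral cache, 40 output tokens.
usage = SimpleNamespace(
    input_tokens=100,
    output_tokens=40,
    cache_read_input_tokens=300,
    cache_creation_input_tokens=50,
    cache_creation=SimpleNamespace(
        ephemeral_5m_input_tokens=50, ephemeral_1h_input_tokens=0
    ),
)
result = create_usage_metadata(usage)
print(result["input_tokens"])  # 100 + 300 + 50 = 450
print(result["total_tokens"])  # 450 + 40 = 490
```

Note that without the add-back, input_tokens would report only the 100 uncached tokens, understating actual usage by the 350 cached tokens.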

Frequently Asked Questions

What does _create_usage_metadata() do?
_create_usage_metadata() converts an Anthropic Usage object into a LangChain UsageMetadata. Because Anthropic's input_tokens count excludes cached tokens, the function adds cache_read and cache_creation tokens back to produce the true input total, and records per-TTL cache-creation counts (5-minute and 1-hour ephemeral caches) in input_token_details when available.
Where is _create_usage_metadata() defined?
_create_usage_metadata() is defined in libs/partners/anthropic/langchain_anthropic/chat_models.py at line 2105.
What calls _create_usage_metadata()?
_create_usage_metadata() is called by two functions: _format_output() and _make_message_chunk_from_anthropic_event().
