test_get_num_tokens_from_messages() — langchain Function Reference

Architecture documentation for test_get_num_tokens_from_messages(), an integration test in test_chat_models.py from the langchain codebase that verifies ChatAnthropic.get_num_tokens_from_messages() returns a positive token count for plain, tool-equipped, and tool-use message sequences.

Entity Profile

Dependency Diagram

graph TD
  33102fb1_0de7_af6e_c1f4_e7dde40b0d7a["test_get_num_tokens_from_messages()"]
  f27640dd_3870_5548_d153_f9504ae1021f["test_chat_models.py"]
  33102fb1_0de7_af6e_c1f4_e7dde40b0d7a -->|defined in| f27640dd_3870_5548_d153_f9504ae1021f
  style 33102fb1_0de7_af6e_c1f4_e7dde40b0d7a fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/anthropic/tests/integration_tests/test_chat_models.py lines 793–845

def test_get_num_tokens_from_messages() -> None:
    llm = ChatAnthropic(model=MODEL_NAME)  # type: ignore[call-arg]

    # Test simple case
    messages = [
        SystemMessage(content="You are a scientist"),
        HumanMessage(content="Hello, Claude"),
    ]
    num_tokens = llm.get_num_tokens_from_messages(messages)
    assert num_tokens > 0

    # Test tool use
    @tool(parse_docstring=True)
    def get_weather(location: str) -> str:
        """Get the current weather in a given location.

        Args:
            location: The city and state, e.g. San Francisco, CA

        """
        return "Sunny"

    messages = [
        HumanMessage(content="What's the weather like in San Francisco?"),
    ]
    num_tokens = llm.get_num_tokens_from_messages(messages, tools=[get_weather])
    assert num_tokens > 0

    messages = [
        HumanMessage(content="What's the weather like in San Francisco?"),
        AIMessage(
            content=[
                {"text": "Let's see.", "type": "text"},
                {
                    "id": "toolu_01V6d6W32QGGSmQm4BT98EKk",
                    "input": {"location": "SF"},
                    "name": "get_weather",
                    "type": "tool_use",
                },
            ],
            tool_calls=[
                {
                    "name": "get_weather",
                    "args": {"location": "SF"},
                    "id": "toolu_01V6d6W32QGGSmQm4BT98EKk",
                    "type": "tool_call",
                },
            ],
        ),
        ToolMessage(content="Sunny", tool_call_id="toolu_01V6d6W32QGGSmQm4BT98EKk"),
    ]
    num_tokens = llm.get_num_tokens_from_messages(messages, tools=[get_weather])
    assert num_tokens > 0
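
The test only asserts that each count is positive rather than an exact number, since the count comes from the model provider and may change between model versions. As an illustration only (this is not the method ChatAnthropic uses, which obtains an exact count from the Anthropic API), a crude local estimate of a message list's token count might look like the following, assuming the common rough heuristic of about four characters per token:

```python
# Illustration only: a crude stand-in for token counting, NOT the
# mechanism ChatAnthropic uses (it asks the Anthropic API for an
# exact count). Assumes ~4 characters per token as a rough heuristic.

def approx_num_tokens_from_messages(messages: list[tuple[str, str]]) -> int:
    """Estimate the token count for a list of (role, content) pairs."""
    rendered = "\n".join(f"{role}: {content}" for role, content in messages)
    return max(1, len(rendered) // 4)

messages = [
    ("system", "You are a scientist"),
    ("human", "Hello, Claude"),
]
print(approx_num_tokens_from_messages(messages))  # prints 12
```

Like the real method, such an estimate is always positive for non-empty input, which is all the test above relies on.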

Frequently Asked Questions

What does test_get_num_tokens_from_messages() do?
test_get_num_tokens_from_messages() is an integration test that verifies ChatAnthropic.get_num_tokens_from_messages() returns a positive token count in three cases: a plain system/human exchange, a request with a bound tool, and a full tool-use conversation that includes an AIMessage with tool calls and a ToolMessage result. Because it lives under integration_tests, it exercises the live Anthropic API rather than a mock.
Where is test_get_num_tokens_from_messages() defined?
test_get_num_tokens_from_messages() is defined in libs/partners/anthropic/tests/integration_tests/test_chat_models.py at line 793.
