test_get_num_tokens_from_messages() — langchain Function Reference

Architecture documentation for the test_get_num_tokens_from_messages() function in test_base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  test_get_num_tokens_from_messages["test_get_num_tokens_from_messages()"]
  test_base_py["test_base.py"]
  test_get_num_tokens_from_messages -->|defined in| test_base_py
  style test_get_num_tokens_from_messages fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/tests/unit_tests/chat_models/test_base.py lines 939–1031

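# Names used below (json, pytest, unittest.mock.patch, ChatOpenAI, and the message
# classes AIMessage, HumanMessage, SystemMessage, ToolCall, ToolMessage) are
# imported at the top of test_base.py and are omitted from this excerpt.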
def test_get_num_tokens_from_messages() -> None:
    llm = ChatOpenAI(model="gpt-4o")
    messages = [
        SystemMessage("you're a good assistant"),
        HumanMessage("how are you"),
        HumanMessage(
            [
                {"type": "text", "text": "what's in this image"},
                {"type": "image_url", "image_url": {"url": "https://foobar.com"}},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://foobar.com", "detail": "low"},
                },
            ]
        ),
        AIMessage("a nice bird"),
        AIMessage(
            "",
            tool_calls=[
                ToolCall(id="foo", name="bar", args={"arg1": "arg1"}, type="tool_call")
            ],
        ),
        AIMessage(
            "",
            additional_kwargs={
                "function_call": {
                    "arguments": json.dumps({"arg1": "arg1"}),
                    "name": "fun",
                }
            },
        ),
        AIMessage(
            "text",
            tool_calls=[
                ToolCall(id="foo", name="bar", args={"arg1": "arg1"}, type="tool_call")
            ],
        ),
        ToolMessage("foobar", tool_call_id="foo"),
    ]
    expected = 431  # Updated to match token count with mocked 100x100 image

    # Mock _url_to_size to avoid PIL dependency in unit tests
    with patch("langchain_openai.chat_models.base._url_to_size") as mock_url_to_size:
        mock_url_to_size.return_value = (100, 100)  # 100x100 pixel image
        actual = llm.get_num_tokens_from_messages(messages)

    assert expected == actual

    # Test file inputs
    messages = [
        HumanMessage(
            [
                "Summarize this document.",
                {
                    "type": "file",
                    "file": {
                        "filename": "my file",
                        "file_data": "data:application/pdf;base64,<data>",
                    },
                },
            ]
        )
    ]
    actual = 0
    with pytest.warns(match="file inputs are not supported"):
        actual = llm.get_num_tokens_from_messages(messages)
    assert actual == 13

    # Test Responses
    messages = [
        AIMessage(
            [
                {
                    "type": "function_call",
                    "name": "multiply",
                    "arguments": '{"x":5,"y":4}',
                    "call_id": "call_abc123",
                    "id": "fc_abc123",
                    "status": "completed",
                },
            ],
        )
    ]
    # ... (excerpt truncated; the full test spans lines 939-1031)

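The test stays hermetic by patching the private helper _url_to_size, so image token accounting needs neither PIL nor network access. A minimal sketch of that pattern, assuming langchain-openai is installed and OPENAI_API_KEY is set in the environment (no API request is made; counting uses tiktoken locally):

from unittest.mock import patch

from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")
messages = [
    HumanMessage(
        [
            {"type": "text", "text": "what's in this image"},
            {"type": "image_url", "image_url": {"url": "https://foobar.com"}},
        ]
    )
]

# Patch the image-sizing helper so the token count is deterministic offline.
with patch("langchain_openai.chat_models.base._url_to_size") as mock_url_to_size:
    mock_url_to_size.return_value = (100, 100)  # pretend the image is 100x100 px
    print(llm.get_num_tokens_from_messages(messages))
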
Frequently Asked Questions

What does test_get_num_tokens_from_messages() do?
test_get_num_tokens_from_messages() is a unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py. It checks ChatOpenAI.get_num_tokens_from_messages() against a fixed expected count for a conversation mixing text, image URLs (with image dimensions mocked via _url_to_size), tool calls, a legacy function_call, and a ToolMessage; it also verifies that unsupported file inputs emit a warning while still returning a count, and exercises Responses-style function_call content blocks.
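For context, a minimal sketch of calling the method under test outside of pytest (assumes langchain-openai is installed and OPENAI_API_KEY is set; the count is computed locally with tiktoken, so no request is sent):

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")
messages = [
    SystemMessage("you're a good assistant"),
    HumanMessage("how are you"),
]

# Token count for the prompt as it would be serialized for the chat API.
print(llm.get_num_tokens_from_messages(messages))
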
Where is test_get_num_tokens_from_messages() defined?
test_get_num_tokens_from_messages() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py at line 939.
