
test_runners.py — anthropic-sdk-python Source File

Architecture documentation for test_runners.py, a Python file in the anthropic-sdk-python codebase. 18 imports, 0 dependents.

File · Python · Domain: AnthropicClient · Subdomain: AsyncClient · 18 imports · 3 functions · 1 class

Entity Profile

Dependency Diagram

graph LR
  85cebff4_9655_3b56_5c62_eac9bdab7c33["test_runners.py"]
  3f845f92_5648_f5e9_e11a_99fb48e12014["utils"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 3f845f92_5648_f5e9_e11a_99fb48e12014
  229bd34b_5597_4fff_5a5a_a39630dac5c9["conftest"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 229bd34b_5597_4fff_5a5a_a39630dac5c9
  6ca3f128_55b8_6054_ffc0_f3ec378adefe["snapshots"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 6ca3f128_55b8_6054_ffc0_f3ec378adefe
  7c94aefc_19e2_089e_dda9_df9baba206fc["json"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 7c94aefc_19e2_089e_dda9_df9baba206fc
  d2343a33_148e_8976_dc34_c6fb437c698c["logging"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> d2343a33_148e_8976_dc34_c6fb437c698c
  9e729a88_70db_e7d4_5fad_450d4231b13f["typing"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 9e729a88_70db_e7d4_5fad_450d4231b13f
  6a9d955d_2702_df7b_b21c_68d8602620ed["typing_extensions"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 6a9d955d_2702_df7b_b21c_68d8602620ed
  d0f0e693_4cba_28a7_bfc5_90fa7f11a0b5["pytest"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> d0f0e693_4cba_28a7_bfc5_90fa7f11a0b5
  862c9522_b1a3_0fc3_2cf8_1ec4b028e32e["respx"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 862c9522_b1a3_0fc3_2cf8_1ec4b028e32e
  49f7d893_b01e_bffb_253a_0ad219ef8819["inline_snapshot"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 49f7d893_b01e_bffb_253a_0ad219ef8819
  cace891e_1645_abce_bdd4_1c29ac8dac9e["anthropic"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> cace891e_1645_abce_bdd4_1c29ac8dac9e
  6ca7eb41_1b3f_0f12_f731_a5169157ef2d["anthropic._utils"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 6ca7eb41_1b3f_0f12_f731_a5169157ef2d
  9f678c93_f47b_9fd1_9a89_7375919d39b4["anthropic._compat"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 9f678c93_f47b_9fd1_9a89_7375919d39b4
  5bdf4f3f_5d0b_400d_d7e8_226b3a8d43ca["anthropic.lib.tools"]
  85cebff4_9655_3b56_5c62_eac9bdab7c33 --> 5bdf4f3f_5d0b_400d_d7e8_226b3a8d43ca
  style 85cebff4_9655_3b56_5c62_eac9bdab7c33 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

import json
import logging
from typing import Any, Dict, List, Union
from typing_extensions import Literal, TypeVar

import pytest
from respx import MockRouter
from inline_snapshot import external, snapshot

from anthropic import Anthropic, AsyncAnthropic, beta_tool, beta_async_tool
from anthropic._utils import assert_signatures_in_sync
from anthropic._compat import PYDANTIC_V1
from anthropic.lib.tools import BetaFunctionToolResultType
from anthropic.lib.tools._beta_runner import BetaToolRunner
from anthropic.types.beta.beta_message import BetaMessage
from anthropic.types.beta.beta_message_param import BetaMessageParam
from anthropic.types.beta.beta_tool_result_block_param import BetaToolResultBlockParam

from ..utils import print_obj
from ...conftest import base_url
from ..snapshots import make_snapshot_request, make_async_snapshot_request, make_stream_snapshot_request

_T = TypeVar("_T")

# all the snapshots in this file are auto-generated from the live API
#
# you can update them with
#
# `ANTHROPIC_LIVE=1 ./scripts/test --inline-snapshot=fix -n0`

snapshots = {
    "basic": {
        "responses": snapshot(
            [
                '{"model": "claude-haiku-4-5-20251001", "id": "msg_0133AjAuLSKXatUZqNkpALPx", "type": "message", "role": "assistant", "content": [{"type": "tool_use", "id": "toolu_01DGiQScbZKPwUBYN79rFUb8", "name": "get_weather", "input": {"location": "San Francisco, CA", "units": "f"}}], "stop_reason": "tool_use", "stop_sequence": null, "usage": {"input_tokens": 656, "cache_creation_input_tokens": 0, "cache_read_input_tokens": 0, "cache_creation": {"ephemeral_5m_input_tokens": 0, "ephemeral_1h_input_tokens": 0}, "output_tokens": 74, "service_tier": "standard"}}',
                '{"model": "claude-haiku-4-5-20251001", "id": "msg_014x2Sxq2p6sewFyUbJp8Mg3", "type": "message", "role": "assistant", "content": [{"type": "text", "text": "The weather in San Francisco, CA is currently **68\\u00b0F** and **Sunny**. It\'s a nice day! \\u2600\\ufe0f"}], "stop_reason": "end_turn", "stop_sequence": null, "usage": {"input_tokens": 770, "cache_creation_input_tokens": 0, "cache_read_input_tokens": 0, "cache_creation": {"ephemeral_5m_input_tokens": 0, "ephemeral_1h_input_tokens": 0}, "output_tokens": 33, "service_tier": "standard"}}',
            ]
        ),
        "result": snapshot(
            "ParsedBetaMessage(container=None, content=[ParsedBetaTextBlock(citations=None, parsed_output=None, text=\"The weather in San Francisco, CA is currently **68°F** and **Sunny**. It's a nice day! ☀️\", type='text')], context_management=None, id='msg_014x2Sxq2p6sewFyUbJp8Mg3', model='claude-haiku-4-5-20251001', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=BetaUsage(cache_creation=BetaCacheCreation(ephemeral_1h_input_tokens=0, ephemeral_5m_input_tokens=0), cache_creation_input_tokens=0, cache_read_input_tokens=0, inference_geo=None, input_tokens=770, iterations=None, output_tokens=33, server_tool_use=None, service_tier='standard', speed=None))\n"
        ),
    },
    "custom": {
        "responses": snapshot(
            [
                '{"model": "claude-haiku-4-5-20251001", "id": "msg_01FKEKbzbqHmJv5ozwH7tz99", "type": "message", "role": "assistant", "content": [{"type": "text", "text": "Let me check the weather for San Francisco for you in Celsius."}, {"type": "tool_use", "id": "toolu_01MxFFv4azdWzubHT3dXurMY", "name": "get_weather", "input": {"location": "San Francisco, CA", "units": "c"}}], "stop_reason": "tool_use", "stop_sequence": null, "usage": {"input_tokens": 659, "cache_creation_input_tokens": 0, "cache_read_input_tokens": 0, "cache_creation": {"ephemeral_5m_input_tokens": 0, "ephemeral_1h_input_tokens": 0}, "output_tokens": 88, "service_tier": "standard"}}',
                '{"model": "claude-haiku-4-5-20251001", "id": "msg_01DSPL7PHKQYTe9VAFkHzsA3", "type": "message", "role": "assistant", "content": [{"type": "text", "text": "The weather in San Francisco, CA is currently **20\\u00b0C** and **Sunny**. Nice weather!"}], "stop_reason": "end_turn", "stop_sequence": null, "usage": {"input_tokens": 787, "cache_creation_input_tokens": 0, "cache_read_input_tokens": 0, "cache_creation": {"ephemeral_5m_input_tokens": 0, "ephemeral_1h_input_tokens": 0}, "output_tokens": 26, "service_tier": "standard"}}',
            ]
        ),
        "result": snapshot(
            "ParsedBetaMessage(container=None, content=[ParsedBetaTextBlock(citations=None, parsed_output=None, text='The weather in San Francisco, CA is currently **20°C** and **Sunny**. Nice weather!', type='text')], context_management=None, id='msg_01DSPL7PHKQYTe9VAFkHzsA3', model='claude-haiku-4-5-20251001', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=BetaUsage(cache_creation=BetaCacheCreation(ephemeral_1h_input_tokens=0, ephemeral_5m_input_tokens=0), cache_creation_input_tokens=0, cache_read_input_tokens=0, inference_geo=None, input_tokens=787, iterations=None, output_tokens=26, server_tool_use=None, service_tier='standard', speed=None))\n"
        ),
    },
    "streaming": {
        "result": snapshot(
            "ParsedBetaMessage(container=None, content=[ParsedBetaTextBlock(citations=None, parsed_output=None, text='The weather in San Francisco, CA is currently **Sunny** with a temperature of **68°F**.', type='text')], context_management=None, id='msg_01Vm8Ddgc8qm4iuUSKbf6jku', model='claude-haiku-4-5-20251001', role='assistant', stop_reason='end_turn', stop_sequence=None, type='message', usage=BetaUsage(cache_creation=BetaCacheCreation(ephemeral_1h_input_tokens=0, ephemeral_5m_input_tokens=0), cache_creation_input_tokens=0, cache_read_input_tokens=0, inference_geo=None, input_tokens=781, iterations=None, output_tokens=25, server_tool_use=None, service_tier='standard', speed=None))\n"
        )
    },
    "tool_call": {
        "responses": snapshot(
# ... (556 more lines)
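The visible snapshots record a two-turn exchange: a tool_use call to get_weather followed by a final text reply. Below is a minimal sketch of the pattern these tests exercise, assuming a tool_runner entry point on client.beta.messages (inferred from the BetaToolRunner import); the method name, parameters, and iteration behavior are illustrative and not copied from this file.

import json

from anthropic import Anthropic, beta_tool


@beta_tool
def get_weather(location: str, units: str) -> str:
    """Return canned weather data matching the recorded snapshots."""
    temperature = "20°C" if units == "c" else "68°F"
    return json.dumps({"location": location, "temperature": temperature, "condition": "Sunny"})


client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Assumed entry point, inferred from the BetaToolRunner import; the real tests
# route these calls through respx mocks and inline-snapshot assertions instead
# of hitting the live API.
runner = client.beta.messages.tool_runner(
    model="claude-haiku-4-5-20251001",
    max_tokens=1024,
    tools=[get_weather],
    messages=[{"role": "user", "content": "What is the weather in San Francisco, CA?"}],
)

for message in runner:
    print(message.stop_reason)  # "tool_use" first, then "end_turn" once the tool result is consumed

In the actual tests, the HTTP layer is mocked with respx (MockRouter) and the resulting ParsedBetaMessage objects are asserted against the inline snapshots shown above.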

Subdomains

  • AsyncClient

Dependencies

  • anthropic
  • anthropic._compat
  • anthropic._utils
  • anthropic.lib.tools
  • anthropic.lib.tools._beta_runner
  • anthropic.types.beta.beta_message
  • anthropic.types.beta.beta_message_param
  • anthropic.types.beta.beta_tool_result_block_param
  • conftest
  • inline_snapshot
  • json
  • logging
  • pytest
  • respx
  • snapshots
  • typing
  • typing_extensions
  • utils

Frequently Asked Questions

What does test_runners.py do?
test_runners.py is a Python test module in the anthropic-sdk-python codebase that exercises the beta tool-runner helpers (beta_tool, BetaToolRunner) against recorded API snapshots. It belongs to the AnthropicClient domain, AsyncClient subdomain.
What functions are defined in test_runners.py?
test_runners.py defines 3 functions: _get_weather, test_basic_call_async, and test_tool_runner_method_in_sync. A hedged sketch of the signature-parity pattern used by the last of these follows this FAQ.
What does test_runners.py depend on?
test_runners.py imports 18 modules: anthropic, anthropic._compat, anthropic._utils, anthropic.lib.tools, anthropic.lib.tools._beta_runner, anthropic.types.beta.beta_message, anthropic.types.beta.beta_message_param, anthropic.types.beta.beta_tool_result_block_param, and 10 more.
Where is test_runners.py in the architecture?
test_runners.py is located at tests/lib/tools/test_runners.py (domain: AnthropicClient, subdomain: AsyncClient, directory: tests/lib/tools).
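Tests named *_in_sync typically guard against sync/async drift between the two clients. The sketch below is a hedged reconstruction: it assumes assert_signatures_in_sync accepts the two callables positionally and that the runner method lives at client.beta.messages.tool_runner, both inferred from the imports shown in the source rather than verified against this file.

from anthropic import Anthropic, AsyncAnthropic
from anthropic._utils import assert_signatures_in_sync


def test_tool_runner_method_in_sync() -> None:
    # Illustrative only: assert that the sync and async tool-runner methods
    # expose the same signature so the two clients cannot drift apart.
    sync_client = Anthropic(api_key="test")
    async_client = AsyncAnthropic(api_key="test")

    assert_signatures_in_sync(
        sync_client.beta.messages.tool_runner,
        async_client.beta.messages.tool_runner,
    )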
