test_set_default_max_tokens() — langchain Function Reference
Architecture documentation for the test_set_default_max_tokens() function in test_chat_models.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    d3cfb5e9_d29b_c8af_153e_365696fe2ca9["test_set_default_max_tokens()"]
    18428dc5_a41b_90c6_88ad_615296ee3311["test_chat_models.py"]
    d3cfb5e9_d29b_c8af_153e_365696fe2ca9 -->|defined in| 18428dc5_a41b_90c6_88ad_615296ee3311
    style d3cfb5e9_d29b_c8af_153e_365696fe2ca9 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/anthropic/tests/unit_tests/test_chat_models.py lines 120–152
def test_set_default_max_tokens() -> None:
    """Test the set_default_max_tokens function."""
    # Test claude-sonnet-4-5 models
    llm = ChatAnthropic(model="claude-sonnet-4-5-20250929", anthropic_api_key="test")
    assert llm.max_tokens == 64000

    # Test claude-opus-4 models
    llm = ChatAnthropic(model="claude-opus-4-20250514", anthropic_api_key="test")
    assert llm.max_tokens == 32000

    # Test claude-sonnet-4 models
    llm = ChatAnthropic(model="claude-sonnet-4-20250514", anthropic_api_key="test")
    assert llm.max_tokens == 64000

    # Test claude-3-7-sonnet models
    llm = ChatAnthropic(model="claude-3-7-sonnet-20250219", anthropic_api_key="test")
    assert llm.max_tokens == 64000

    # Test claude-3-5-haiku models
    llm = ChatAnthropic(model="claude-3-5-haiku-20241022", anthropic_api_key="test")
    assert llm.max_tokens == 8192

    # Test claude-3-haiku models (should default to 4096)
    llm = ChatAnthropic(model="claude-3-haiku-20240307", anthropic_api_key="test")
    assert llm.max_tokens == 4096

    # Test that existing max_tokens values are preserved
    llm = ChatAnthropic(model=MODEL_NAME, max_tokens=2048, anthropic_api_key="test")
    assert llm.max_tokens == 2048

    # Test that explicitly set max_tokens values are preserved
    llm = ChatAnthropic(model=MODEL_NAME, max_tokens=4096, anthropic_api_key="test")
    assert llm.max_tokens == 4096
Frequently Asked Questions
What does test_set_default_max_tokens() do?
test_set_default_max_tokens() is a unit test in the langchain codebase, defined in libs/partners/anthropic/tests/unit_tests/test_chat_models.py. It verifies that ChatAnthropic assigns the expected default max_tokens for each Claude model family (for example, 64000 for claude-sonnet-4-5, 8192 for claude-3-5-haiku, and 4096 for older models such as claude-3-haiku) and that an explicitly provided max_tokens value is preserved.
Where is test_set_default_max_tokens() defined?
test_set_default_max_tokens() is defined in libs/partners/anthropic/tests/unit_tests/test_chat_models.py at line 120.