test_configurable_with_default() — langchain Function Reference
Architecture documentation for the test_configurable_with_default() function in test_chat_models.py from the langchain codebase.
Dependency Diagram
```mermaid
graph TD
    53fadada_afe1_3e86_331e_71f72cd3eca6["test_configurable_with_default()"]
    29aaa9ad_8f95_7e3c_f947_66f656fbb0e8["test_chat_models.py"]
    53fadada_afe1_3e86_331e_71f72cd3eca6 -->|defined in| 29aaa9ad_8f95_7e3c_f947_66f656fbb0e8
    style 53fadada_afe1_3e86_331e_71f72cd3eca6 fill:#6366f1,stroke:#818cf8,color:#fff
```
Source Code
libs/langchain_v1/tests/unit_tests/chat_models/test_chat_models.py lines 237–337
````python
def test_configurable_with_default() -> None:
    """Test configurable chat model behavior with default parameters.

    Verifies that a configurable chat model initialized with default parameters:

    - Has access to all standard runnable methods (`invoke`, `stream`, etc.)
    - Provides immediate access to non-configurable methods (e.g. `get_num_tokens`)
    - Supports model switching through runtime configuration using `config_prefix`
    - Maintains proper model identity and attributes when reconfigured
    - Can be used in chains with different model providers via configuration

    Example:
        ```python
        # Creates a configurable model with a default model parameter
        model = init_chat_model("gpt-4o", configurable_fields="any", config_prefix="bar")

        # Works immediately -- uses the default gpt-4o
        tokens = model.get_num_tokens("hello")

        # Also works -- switches to Claude at runtime via the prefixed key
        response = model.invoke(
            "Hello", config={"configurable": {"bar_model": "claude-3-sonnet-20240229"}}
        )
        ```
    """
    model = init_chat_model("gpt-4o", configurable_fields="any", config_prefix="bar")
    for method in (
        "invoke",
        "ainvoke",
        "batch",
        "abatch",
        "stream",
        "astream",
        "batch_as_completed",
        "abatch_as_completed",
    ):
        assert hasattr(model, method)

    # Also has access to non-configurable, non-declarative methods, since default
    # params are provided.
    for method in ("get_num_tokens", "get_num_tokens_from_messages", "dict"):
        assert hasattr(model, method)

    assert model.model_name == "gpt-4o"

    model_with_tools = model.bind_tools(
        [{"name": "foo", "description": "foo", "parameters": {}}],
    )
    model_with_config = model_with_tools.with_config(
        RunnableConfig(tags=["foo"]),
        configurable={"bar_model": "claude-sonnet-4-5-20250929"},
    )
    assert model_with_config.model == "claude-sonnet-4-5-20250929"  # type: ignore[attr-defined]
    assert model_with_config.model_dump() == {  # type: ignore[attr-defined]
        "name": None,
        "bound": {
            "name": None,
            "disable_streaming": False,
            "effort": None,
            "model": "claude-sonnet-4-5-20250929",
            "mcp_servers": None,
            "max_tokens": 64000,
            "temperature": None,
            "thinking": None,
            "top_k": None,
            "top_p": None,
            "default_request_timeout": None,
            "max_retries": 2,
            "stop_sequences": None,
            "anthropic_api_url": "https://api.anthropic.com",
            "anthropic_proxy": None,
            "context_management": None,
            "anthropic_api_key": SecretStr("bar"),
            "betas": None,
            "default_headers": None,
            "model_kwargs": {},
            "reuse_last_container": None,
            "inference_geo": None,
            "streaming": False,
            # … (listing truncated; full source spans lines 237–337)
````
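In the `model_dump()` comparison above, the API key is compared as `SecretStr("bar")`: a `SecretStr` masks its value in `repr` and `str` so secrets do not leak into logs or dumps. A minimal pure-Python sketch of that behavior (an illustration, not pydantic's actual implementation):

```python
class SecretStr:
    """Minimal stand-in for pydantic's SecretStr: hides the value in repr/str."""

    def __init__(self, value: str) -> None:
        self._value = value

    def get_secret_value(self) -> str:
        # The raw value is only available through this explicit accessor.
        return self._value

    def __repr__(self) -> str:
        return "SecretStr('**********')"

    __str__ = __repr__


key = SecretStr("bar")
print(repr(key))               # SecretStr('**********')
print(key.get_secret_value())  # bar
```

Equality in the real class also compares the underlying secret values, which is what lets the `model_dump()` assertion above match.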
Frequently Asked Questions
What does test_configurable_with_default() do?
test_configurable_with_default() is a unit test in the langchain codebase, defined in libs/langchain_v1/tests/unit_tests/chat_models/test_chat_models.py. It verifies that a configurable chat model created via init_chat_model with default parameters exposes the standard runnable methods, provides immediate access to non-configurable methods such as get_num_tokens, and can be switched to a different model provider at runtime through a config_prefix-qualified configuration key.
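The `config_prefix="bar"` mechanism the test exercises routes runtime configuration through prefixed keys such as `bar_model`. A rough sketch of the key convention (a hypothetical helper for illustration, not langchain's actual implementation):

```python
def resolve_prefixed_config(config: dict, prefix: str) -> dict:
    """Map prefixed configurable keys back to model fields.

    With prefix="bar", {"bar_model": "claude-..."} becomes {"model": "claude-..."}.
    Hypothetical illustration of init_chat_model's config_prefix convention.
    """
    marker = f"{prefix}_"
    configurable = config.get("configurable", {})
    return {
        key[len(marker):]: value
        for key, value in configurable.items()
        if key.startswith(marker)
    }


params = resolve_prefixed_config(
    {"configurable": {"bar_model": "claude-sonnet-4-5-20250929"}}, prefix="bar"
)
print(params)  # {'model': 'claude-sonnet-4-5-20250929'}
```

The prefix lets several configurable models coexist in one chain without their configuration keys colliding, which is why the test asserts that `bar_model` (not a bare `model` key) selects the Claude model.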
Where is test_configurable_with_default() defined?
test_configurable_with_default() is defined in libs/langchain_v1/tests/unit_tests/chat_models/test_chat_models.py at line 237.