test_configurable() — langchain Function Reference
Architecture documentation for the test_configurable() function in test_chat_models.py from the langchain codebase.
Dependency Diagram
```mermaid
graph TD
    27f552a5_502a_d862_4baa_e7194c649e84["test_configurable()"]
    29aaa9ad_8f95_7e3c_f947_66f656fbb0e8["test_chat_models.py"]
    27f552a5_502a_d862_4baa_e7194c649e84 -->|defined in| 29aaa9ad_8f95_7e3c_f947_66f656fbb0e8
    style 27f552a5_502a_d862_4baa_e7194c649e84 fill:#6366f1,stroke:#818cf8,color:#fff
```
Source Code
libs/langchain_v1/tests/unit_tests/chat_models/test_chat_models.py lines 107–228
````python
def test_configurable() -> None:
    """Test configurable chat model behavior without default parameters.

    Verifies that a configurable chat model initialized without default parameters:

    - Has access to all standard runnable methods (`invoke`, `stream`, etc.)
    - Blocks access to non-configurable methods until configuration is provided
    - Supports declarative operations (`bind_tools`) without mutating original model
    - Can chain declarative operations and configuration to access full functionality
    - Properly resolves to the configured model type when parameters are provided

    Example:
        ```python
        # This creates a configurable model without specifying which model
        model = init_chat_model()

        # This will FAIL - no model specified yet
        model.get_num_tokens("hello")  # AttributeError!

        # This works - provides model at runtime
        response = model.invoke("Hello", config={"configurable": {"model": "gpt-4o"}})
        ```
    """
    model = init_chat_model()
    for method in (
        "invoke",
        "ainvoke",
        "batch",
        "abatch",
        "stream",
        "astream",
        "batch_as_completed",
        "abatch_as_completed",
    ):
        assert hasattr(model, method)

    # Doesn't have access to non-configurable, non-declarative methods until a
    # config is provided.
    for method in ("get_num_tokens", "get_num_tokens_from_messages"):
        with pytest.raises(AttributeError):
            getattr(model, method)

    # Can call declarative methods even without a default model.
    model_with_tools = model.bind_tools(
        [{"name": "foo", "description": "foo", "parameters": {}}],
    )

    # Check that the original model wasn't mutated by the declarative operation.
    assert model._queued_declarative_operations == []

    # Can iteratively call declarative methods.
    model_with_config = model_with_tools.with_config(
        RunnableConfig(tags=["foo"]),
        configurable={"model": "gpt-4o"},
    )
    assert model_with_config.model_name == "gpt-4o"  # type: ignore[attr-defined]
    for method in ("get_num_tokens", "get_num_tokens_from_messages"):
        assert hasattr(model_with_config, method)
    assert model_with_config.model_dump() == {  # type: ignore[attr-defined]
        "name": None,
        "bound": {
            "name": None,
            "disable_streaming": False,
            "disabled_params": None,
            "model_name": "gpt-4o",
            "temperature": None,
            "model_kwargs": {},
            "openai_api_key": SecretStr("foo"),
            "openai_api_base": None,
            "openai_organization": None,
            "openai_proxy": None,
            "output_version": None,
            "request_timeout": None,
            "max_retries": None,
            "presence_penalty": None,
            "reasoning": None,
            "reasoning_effort": None,
            "verbosity": None,
            "frequency_penalty": None,
            # ... (listing truncated here; the full test spans lines 107-228)
````
Frequently Asked Questions
What does test_configurable() do?
test_configurable() is a unit test in the langchain codebase, defined in libs/langchain_v1/tests/unit_tests/chat_models/test_chat_models.py. It verifies the behavior of a configurable chat model created by init_chat_model() without default parameters: standard runnable methods are available immediately, non-configurable methods raise AttributeError until a model is configured, and declarative operations such as bind_tools() do not mutate the original model.
Where is test_configurable() defined?
test_configurable() is defined in libs/langchain_v1/tests/unit_tests/chat_models/test_chat_models.py at line 107.