test_context_overflow_error_invoke_async_responses_api() — langchain Function Reference
Architecture documentation for the test_context_overflow_error_invoke_async_responses_api() function in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    bce1ffec_4611_5f9c_a33c_8f4ca6e1ea99["test_context_overflow_error_invoke_async_responses_api()"]
    48232d20_f8c1_b597_14fa_7dc407e9bfe5["test_base.py"]
    bce1ffec_4611_5f9c_a33c_8f4ca6e1ea99 -->|defined in| 48232d20_f8c1_b597_14fa_7dc407e9bfe5
    style bce1ffec_4611_5f9c_a33c_8f4ca6e1ea99 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/unit_tests/chat_models/test_base.py lines 3307–3320
async def test_context_overflow_error_invoke_async_responses_api() -> None:
    """Test context overflow error on invoke (async, responses API)."""
    llm = ChatOpenAI(use_responses_api=True)
    with (  # noqa: PT012
        patch.object(
            llm.root_async_client.responses, "with_raw_response"
        ) as mock_client,
        pytest.raises(ContextOverflowError) as exc_info,
    ):
        mock_client.create.side_effect = _CONTEXT_OVERFLOW_BAD_REQUEST_ERROR
        await llm.ainvoke([HumanMessage(content="test")])
    assert "Input tokens exceed the configured limit" in str(exc_info.value)
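The test patches the Responses client's with_raw_response accessor and makes its create call raise _CONTEXT_OVERFLOW_BAD_REQUEST_ERROR, a module-level fixture whose definition is not part of this excerpt. The following is a minimal sketch of how such a fixture could be constructed; the request URL, JSON body, and error code are assumptions, and only the asserted message text comes from the test itself.

import httpx
import openai

# Hypothetical reconstruction of the shared fixture referenced above; the
# real definition lives elsewhere in test_base.py and may differ in detail.
_request = httpx.Request("POST", "https://api.openai.com/v1/responses")
_response = httpx.Response(
    status_code=400,
    request=_request,
    json={
        "error": {
            "message": "Input tokens exceed the configured limit",
            "type": "invalid_request_error",
            "code": "context_length_exceeded",  # assumed error code
        }
    },
)
_CONTEXT_OVERFLOW_BAD_REQUEST_ERROR = openai.BadRequestError(
    "Input tokens exceed the configured limit",
    response=_response,
    body=_response.json(),
)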
Frequently Asked Questions
What does test_context_overflow_error_invoke_async_responses_api() do?
test_context_overflow_error_invoke_async_responses_api() is an async unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py. It verifies that a ChatOpenAI instance configured with use_responses_api=True raises ContextOverflowError from ainvoke() when the underlying OpenAI Responses client rejects the request with a context-overflow bad request error, and that the raised error's message mentions that input tokens exceed the configured limit.
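From an application's point of view, the behavior covered by this test looks roughly like the sketch below. The import path for ContextOverflowError is an assumption and is not taken from the test file.

from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

# Assumed import location for ContextOverflowError; verify against your
# installed langchain version before relying on it.
from langchain_core.exceptions import ContextOverflowError


async def answer(prompt: str) -> str | None:
    llm = ChatOpenAI(use_responses_api=True)
    try:
        result = await llm.ainvoke([HumanMessage(content=prompt)])
    except ContextOverflowError:
        # The prompt exceeded the model's context window; shrink the input
        # and retry, or fall back to a larger-context model.
        return None
    return str(result.content)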
Where is test_context_overflow_error_invoke_async_responses_api() defined?
test_context_overflow_error_invoke_async_responses_api() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py at line 3307.