test_openai_invoke() — langchain Function Reference
Architecture documentation for the test_openai_invoke() function in test_base.py from the langchain codebase.
Dependency Diagram
graph TD
  cc72042c_65fd_5078_8d1f_33b07260a3e9["test_openai_invoke()"]
  bd382a4e_442c_13ae_530c_6e34bc43623d["test_base.py"]
  cc72042c_65fd_5078_8d1f_33b07260a3e9 -->|defined in| bd382a4e_442c_13ae_530c_6e34bc43623d
  style cc72042c_65fd_5078_8d1f_33b07260a3e9 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_base.py lines 234–263
def test_openai_invoke() -> None:
    """Test invoke tokens from ChatOpenAI."""
    llm = ChatOpenAI(
        model="gpt-5-nano",
        service_tier="flex",  # Also test service_tier
        max_retries=3,  # Add retries for 503 capacity errors
    )

    result = llm.invoke("Hello", config={"tags": ["foo"]})
    assert isinstance(result.content, str)
    usage_metadata = result.usage_metadata  # type: ignore[attr-defined]

    # assert no response headers if include_response_headers is not set
    assert "headers" not in result.response_metadata

    assert usage_metadata is not None
    flex_input = usage_metadata.get("input_token_details", {}).get("flex")
    assert isinstance(flex_input, int)
    assert flex_input > 0
    assert flex_input == usage_metadata.get("input_tokens")
    flex_output = usage_metadata.get("output_token_details", {}).get("flex")
    assert isinstance(flex_output, int)
    assert flex_output > 0
    # GPT-5-nano/reasoning model specific. Remove if model used in test changes.
    flex_reasoning = usage_metadata.get("output_token_details", {}).get(
        "flex_reasoning"
    )
    assert isinstance(flex_reasoning, int)
    assert flex_reasoning > 0
    assert flex_reasoning + flex_output == usage_metadata.get("output_tokens")
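For context, here is a minimal sketch (not part of the test file) of the invocation pattern the test exercises. It assumes langchain-openai is installed and a valid OPENAI_API_KEY is available; the model name and service_tier mirror the test above, while the exact keys present in usage_metadata depend on the model and provider response.

from langchain_openai import ChatOpenAI

# Same construction as in the test: a flex-tier reasoning model with retries.
llm = ChatOpenAI(
    model="gpt-5-nano",
    service_tier="flex",
    max_retries=3,
)

# invoke() returns an AIMessage; config tags are propagated to callbacks/tracing.
result = llm.invoke("Hello", config={"tags": ["foo"]})

# usage_metadata is a dict-like structure with input_tokens, output_tokens,
# total_tokens, and optional input_token_details / output_token_details
# breakdowns, which the assertions above inspect.
print(result.content)
print(result.usage_metadata)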
Frequently Asked Questions
What does test_openai_invoke() do?
test_openai_invoke() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py. It invokes ChatOpenAI (model gpt-5-nano with service_tier="flex" and max_retries=3) on a short prompt and asserts that the response content is a string, that no response headers are attached when include_response_headers is not set, and that the usage metadata reports positive flex input, output, and reasoning token counts consistent with the reported totals.
Where is test_openai_invoke() defined?
test_openai_invoke() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_base.py at line 234.
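Because this is an integration test, running it requires network access and real OpenAI credentials. A hedged sketch of selecting just this test via pytest's Python entry point is shown below; the file path comes from the section above, and the OPENAI_API_KEY environment variable name is an assumption about the local setup.

import os
import pytest

# Integration tests hit the live OpenAI API, so a real key must be configured.
assert os.environ.get("OPENAI_API_KEY"), "set OPENAI_API_KEY before running"

# Select the single test by node id (path::function) and run it quietly.
pytest.main([
    "libs/partners/openai/tests/integration_tests/chat_models/test_base.py::test_openai_invoke",
    "-q",
])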