
_invoke() — langchain Function Reference

Architecture documentation for the _invoke() function in test_responses_standard.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  3b067821_fac2_9f7c_2f0e_98992a056f44["_invoke()"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0["test_responses_standard.py"]
  3b067821_fac2_9f7c_2f0e_98992a056f44 -->|defined in| cf462a1a_46c7_84ad_3ab6_db148c0ffcc0
  f94ea83b_6f8c_3303_ea07_185c3e5ae19d["invoke_with_cache_read_input()"]
  f94ea83b_6f8c_3303_ea07_185c3e5ae19d -->|calls| 3b067821_fac2_9f7c_2f0e_98992a056f44
  59088497_61be_dc54_ef74_dbec3fa4367e["invoke_with_reasoning_output()"]
  59088497_61be_dc54_ef74_dbec3fa4367e -->|calls| 3b067821_fac2_9f7c_2f0e_98992a056f44
  style 3b067821_fac2_9f7c_2f0e_98992a056f44 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py lines 130–136

def _invoke(llm: ChatOpenAI, input_: str, stream: bool) -> AIMessage:
    if stream:
        full = None
        for chunk in llm.stream(input_):
            full = full + chunk if full else chunk  # type: ignore[operator]
        return cast(AIMessage, full)
    return cast(AIMessage, llm.invoke(input_))
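
For context, the sketch below lists the imports the excerpt relies on (they live elsewhere in the test module) and adds an illustrative call in both modes. It assumes the _invoke() definition shown above is in scope; the model name and prompt are assumptions, not code from the repository.

# Imports required by _invoke(), plus an illustrative call in both modes.
# Assumes the _invoke() definition shown above; model name and prompt are assumptions.
from typing import cast  # used inside _invoke() to narrow the return type

from langchain_core.messages import AIMessage
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # assumed model name
streamed = _invoke(llm, "Say hello.", stream=True)   # AIMessage accumulated from stream chunks
direct = _invoke(llm, "Say hello.", stream=False)    # AIMessage from a single invoke() call
assert isinstance(streamed, AIMessage)
assert isinstance(direct, AIMessage)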

Frequently Asked Questions

What does _invoke() do?
_invoke() is a test helper that sends a prompt to a ChatOpenAI model and returns the response as an AIMessage. When stream is True it iterates over llm.stream(input_) and accumulates the chunks into a single message; otherwise it returns the result of llm.invoke(input_) directly. It is defined in libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py.
Where is _invoke() defined?
_invoke() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py at line 130.
What calls _invoke()?
_invoke() is called by two functions: invoke_with_cache_read_input() and invoke_with_reasoning_output().
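
As a purely hypothetical illustration of that calling pattern (the real invoke_with_* functions are not reproduced here), a caller could wrap _invoke() to exercise both modes against one prompt; the function name invoke_both_modes below is an assumption:

# Hypothetical caller sketch -- not the repository's invoke_with_* code.
# Assumes the _invoke() definition from the Source Code section and a configured llm.
from langchain_core.messages import AIMessage
from langchain_openai import ChatOpenAI


def invoke_both_modes(llm: ChatOpenAI, prompt: str) -> list[AIMessage]:
    # Run the same prompt through _invoke() with and without streaming
    # and make a basic type check on each result.
    messages = [_invoke(llm, prompt, stream) for stream in (True, False)]
    for message in messages:
        assert isinstance(message, AIMessage)
    return messages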
