
test_responses_standard.py — langchain Source File

Architecture documentation for test_responses_standard.py, a Python file in the langchain codebase. 9 imports, 0 dependents.

File · Python · Domain: LangChainCore · Subdomain: LanguageModelBase · 9 imports · 1 function · 1 class

Entity Profile

Dependency Diagram

graph LR
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0["test_responses_standard.py"]
  d3750461_97f2_1730_4a69_37dae25b7cb2["base64"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> d3750461_97f2_1730_4a69_37dae25b7cb2
  927570d8_11a6_5c17_0f0d_80baae0c733e["pathlib"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> 927570d8_11a6_5c17_0f0d_80baae0c733e
  feec1ec4_6917_867b_d228_b134d0ff8099["typing"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> feec1ec4_6917_867b_d228_b134d0ff8099
  d2b62b81_6a74_9153_fcd4_ff7470a9b3d2["httpx"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> d2b62b81_6a74_9153_fcd4_ff7470a9b3d2
  f69d6389_263d_68a4_7fbf_f14c0602a9ba["pytest"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> f69d6389_263d_68a4_7fbf_f14c0602a9ba
  e929cf21_6ab8_6ff3_3765_0d35a099a053["langchain_core.language_models"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> e929cf21_6ab8_6ff3_3765_0d35a099a053
  9444498b_8066_55c7_b3a2_1d90c4162a32["langchain_core.messages"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> 9444498b_8066_55c7_b3a2_1d90c4162a32
  2cad93e6_586a_5d28_a74d_4ec6fd4d2227["langchain_openai"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> 2cad93e6_586a_5d28_a74d_4ec6fd4d2227
  4af0fc99_8877_775e_0926_360d22a1d2b7["tests.integration_tests.chat_models.test_base_standard"]
  cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 --> 4af0fc99_8877_775e_0926_360d22a1d2b7
  style cf462a1a_46c7_84ad_3ab6_db148c0ffcc0 fill:#6366f1,stroke:#818cf8,color:#fff


Source Code

"""Standard LangChain interface tests for Responses API"""

import base64
from pathlib import Path
from typing import cast

import httpx
import pytest
from langchain_core.language_models import BaseChatModel
from langchain_core.messages import AIMessage, HumanMessage, ToolMessage

from langchain_openai import ChatOpenAI
from tests.integration_tests.chat_models.test_base_standard import TestOpenAIStandard

REPO_ROOT_DIR = Path(__file__).parents[6]


class TestOpenAIResponses(TestOpenAIStandard):
    @property
    def chat_model_class(self) -> type[BaseChatModel]:
        return ChatOpenAI

    @property
    def chat_model_params(self) -> dict:
        return {"model": "gpt-4o-mini", "use_responses_api": True}

    @property
    def supports_image_tool_message(self) -> bool:
        return True

    @pytest.mark.xfail(reason="Unsupported.")
    def test_stop_sequence(self, model: BaseChatModel) -> None:
        super().test_stop_sequence(model)

    def invoke_with_cache_read_input(self, *, stream: bool = False) -> AIMessage:
        with Path.open(REPO_ROOT_DIR / "README.md") as f:
            readme = f.read()

        input_ = f"""What's langchain? Here's the langchain README:

        {readme}
        """
        llm = ChatOpenAI(model="gpt-4.1-mini", use_responses_api=True)
        _invoke(llm, input_, stream)
        # invoke twice so first invocation is cached
        return _invoke(llm, input_, stream)

    def invoke_with_reasoning_output(self, *, stream: bool = False) -> AIMessage:
        llm = ChatOpenAI(
            model="o4-mini",
            reasoning={"effort": "medium", "summary": "auto"},
            use_responses_api=True,
        )
        input_ = "What was the 3rd highest building in 2000?"
        return _invoke(llm, input_, stream)

    @pytest.mark.flaky(retries=3, delay=1)
    def test_openai_pdf_inputs(self, model: BaseChatModel) -> None:
        """Test that the model can process PDF inputs."""
        super().test_openai_pdf_inputs(model)
# ... (77 more lines)
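
The truncated portion of the file defines the module-level _invoke helper used by invoke_with_cache_read_input and invoke_with_reasoning_output above. The actual implementation is elided here; a minimal sketch of how such a helper could dispatch between streaming and non-streaming calls (an illustration only, reusing the imports already shown) might look like this:

def _invoke(llm: ChatOpenAI, input_: str, stream: bool) -> AIMessage:
    """Invoke the model, optionally via streaming, and return a full AIMessage.

    Sketch for illustration; the file's actual helper is elided above.
    """
    if stream:
        full = None
        # Accumulate streamed chunks into a single message via chunk addition.
        for chunk in llm.stream(input_):
            full = chunk if full is None else full + chunk
        return cast(AIMessage, full)
    return cast(AIMessage, llm.invoke(input_))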

Domain

  • LangChainCore

Subdomains

  • LanguageModelBase

Functions

  • _invoke

Dependencies

  • base64
  • httpx
  • langchain_core.language_models
  • langchain_core.messages
  • langchain_openai
  • pathlib
  • pytest
  • tests.integration_tests.chat_models.test_base_standard
  • typing

Frequently Asked Questions

What does test_responses_standard.py do?
test_responses_standard.py contains the standard LangChain interface integration tests for ChatOpenAI running against the OpenAI Responses API. Its TestOpenAIResponses class subclasses TestOpenAIStandard and sets use_responses_api=True in chat_model_params. The file belongs to the LangChainCore domain, LanguageModelBase subdomain.
What functions are defined in test_responses_standard.py?
test_responses_standard.py defines one module-level function: _invoke.
What does test_responses_standard.py depend on?
test_responses_standard.py imports 9 modules: base64, httpx, langchain_core.language_models, langchain_core.messages, langchain_openai, pathlib, pytest, tests.integration_tests.chat_models.test_base_standard, and typing.
Where is test_responses_standard.py in the architecture?
test_responses_standard.py is located at libs/partners/openai/tests/integration_tests/chat_models/test_responses_standard.py (domain: LangChainCore, subdomain: LanguageModelBase, directory: libs/partners/openai/tests/integration_tests/chat_models).
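
For illustration, the configuration these tests exercise can be reproduced directly. A minimal sketch, assuming an OPENAI_API_KEY is available in the environment and using the model name from chat_model_params above:

from langchain_openai import ChatOpenAI

# Mirror the test's chat_model_params: target the Responses API
# instead of the default Chat Completions API.
llm = ChatOpenAI(model="gpt-4o-mini", use_responses_api=True)

# A simple invocation; TestOpenAIResponses exercises this interface
# (plus streaming, tool messages, and PDF/image inputs) against the live API.
response = llm.invoke("What is LangChain?")
print(response.content)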
