test_file_search() — langchain Function Reference
Architecture documentation for the test_file_search() function in test_responses_api.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    24c11427_0fa4_2b54_2137_182ca37afbc3["test_file_search()"]
    992496d5_b7d4_139f_00cf_3e585d851f81["test_responses_api.py"]
    24c11427_0fa4_2b54_2137_182ca37afbc3 -->|defined in| 992496d5_b7d4_139f_00cf_3e585d851f81
    b0966d53_e5bb_3879_d8d6_00823de68309["_check_response()"]
    24c11427_0fa4_2b54_2137_182ca37afbc3 -->|calls| b0966d53_e5bb_3879_d8d6_00823de68309
    style 24c11427_0fa4_2b54_2137_182ca37afbc3 fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_responses_api.py lines 412–469
def test_file_search(
    output_version: Literal["responses/v1", "v1"],
) -> None:
    vector_store_id = os.getenv("OPENAI_VECTOR_STORE_ID")
    if not vector_store_id:
        pytest.skip()
    llm = ChatOpenAI(
        model=MODEL_NAME,
        use_responses_api=True,
        output_version=output_version,
    )
    tool = {
        "type": "file_search",
        "vector_store_ids": [vector_store_id],
    }
    input_message = {"role": "user", "content": "What is deep research by OpenAI?"}
    response = llm.invoke([input_message], tools=[tool])
    _check_response(response)
    if output_version == "v1":
        assert [block["type"] for block in response.content] == [  # type: ignore[index]
            "server_tool_call",
            "server_tool_result",
            "text",
        ]
    else:
        assert [block["type"] for block in response.content] == [  # type: ignore[index]
            "file_search_call",
            "text",
        ]
    full: AIMessageChunk | None = None
    for chunk in llm.stream([input_message], tools=[tool]):
        assert isinstance(chunk, AIMessageChunk)
        full = chunk if full is None else full + chunk
    assert isinstance(full, AIMessageChunk)
    _check_response(full)
    if output_version == "v1":
        assert [block["type"] for block in full.content] == [  # type: ignore[index]
            "server_tool_call",
            "server_tool_result",
            "text",
        ]
    else:
        assert [block["type"] for block in full.content] == ["file_search_call", "text"]  # type: ignore[index]
    next_message = {"role": "user", "content": "Thank you."}
    _ = llm.invoke([input_message, full, next_message])
    for message in [response, full]:
        assert [block["type"] for block in message.content_blocks] == [
            "server_tool_call",
            "server_tool_result",
            "text",
        ]
Frequently Asked Questions
What does test_file_search() do?
test_file_search() is an integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/chat_models/test_responses_api.py. It verifies that ChatOpenAI, when using the OpenAI Responses API, can call the built-in file_search tool against a vector store and emit the expected content blocks in both invoke and stream modes. The test is skipped unless the OPENAI_VECTOR_STORE_ID environment variable is set.
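The content-block expectations can be stated concretely: the sequence of block types the test asserts depends on `output_version`. The mapping below is lifted directly from the assertions in the source listing above:

```python
# Content-block "type" sequences that test_file_search() asserts for
# each output_version (copied from the test's assertions).
EXPECTED_BLOCK_TYPES = {
    "v1": ["server_tool_call", "server_tool_result", "text"],
    "responses/v1": ["file_search_call", "text"],
}

# At the end of the test, the normalized content_blocks view is asserted
# to use the v1 shape for both output versions.
NORMALIZED_BLOCK_TYPES = ["server_tool_call", "server_tool_result", "text"]
```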
Where is test_file_search() defined?
test_file_search() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_responses_api.py at line 412.
What does test_file_search() call?
test_file_search() calls one helper function: _check_response(), which it applies to both the invoked response and the accumulated streamed message.
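_check_response() is defined elsewhere in the same test module and is not shown on this page. As a rough illustration only, the hypothetical sketch below shows the kind of checks such a validator performs; it does not reproduce langchain's actual helper:

```python
# Hypothetical sketch of a response validator; NOT langchain's real
# _check_response(). It only illustrates the kind of assertions such a
# helper typically makes on a chat model response.
def check_response_sketch(response) -> None:
    # The response should carry at least one content block...
    assert response.content, "response has no content"
    # ...and every block should declare a type.
    for block in response.content:
        assert "type" in block, f"block missing 'type': {block!r}"


# A minimal well-formed stand-in response passes the sketch's checks.
fake_response = type("Msg", (), {"content": [{"type": "text", "text": "hi"}]})()
check_response_sketch(fake_response)
```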