test__construct_lc_result_from_responses_api_basic_text_response() — langchain Function Reference
Architecture documentation for the test__construct_lc_result_from_responses_api_basic_text_response() function in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    333c0e53_410e_ef08_e869_6e884df1b110["test__construct_lc_result_from_responses_api_basic_text_response()"]
    48232d20_f8c1_b597_14fa_7dc407e9bfe5["test_base.py"]
    333c0e53_410e_ef08_e869_6e884df1b110 -->|defined in| 48232d20_f8c1_b597_14fa_7dc407e9bfe5
    style 333c0e53_410e_ef08_e869_6e884df1b110 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/unit_tests/chat_models/test_base.py lines 1392–1448
def test__construct_lc_result_from_responses_api_basic_text_response() -> None:
    """Test a basic text response with no tools or special features."""
    response = Response(
        id="resp_123",
        created_at=1234567890,
        model="gpt-4o",
        object="response",
        parallel_tool_calls=True,
        tools=[],
        tool_choice="auto",
        output=[
            ResponseOutputMessage(
                type="message",
                id="msg_123",
                content=[
                    ResponseOutputText(
                        type="output_text", text="Hello, world!", annotations=[]
                    )
                ],
                role="assistant",
                status="completed",
            )
        ],
        usage=ResponseUsage(
            input_tokens=10,
            output_tokens=3,
            total_tokens=13,
            input_tokens_details=InputTokensDetails(cached_tokens=0),
            output_tokens_details=OutputTokensDetails(reasoning_tokens=0),
        ),
    )

    # v0
    result = _construct_lc_result_from_responses_api(response, output_version="v0")

    assert isinstance(result, ChatResult)
    assert len(result.generations) == 1
    assert isinstance(result.generations[0], ChatGeneration)
    assert isinstance(result.generations[0].message, AIMessage)
    assert result.generations[0].message.content == [
        {"type": "text", "text": "Hello, world!", "annotations": []}
    ]
    assert result.generations[0].message.id == "msg_123"
    assert result.generations[0].message.usage_metadata
    assert result.generations[0].message.usage_metadata["input_tokens"] == 10
    assert result.generations[0].message.usage_metadata["output_tokens"] == 3
    assert result.generations[0].message.usage_metadata["total_tokens"] == 13
    assert result.generations[0].message.response_metadata["id"] == "resp_123"
    assert result.generations[0].message.response_metadata["model_name"] == "gpt-4o"

    # responses/v1
    result = _construct_lc_result_from_responses_api(response)
    assert result.generations[0].message.content == [
        {"type": "text", "text": "Hello, world!", "annotations": [], "id": "msg_123"}
    ]
    assert result.generations[0].message.id == "resp_123"
    assert result.generations[0].message.response_metadata["id"] == "resp_123"
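The excerpt above omits the imports that appear earlier in test_base.py. A minimal sketch of what they would look like is shown below; the module paths are assumptions based on the openai SDK and langchain-openai package layout, not something stated on this page.

# Hypothetical imports for running the excerpt standalone.
# Module paths are assumptions, not quoted from test_base.py.
from langchain_core.messages import AIMessage
from langchain_core.outputs import ChatGeneration, ChatResult
from openai.types.responses import (
    Response,
    ResponseOutputMessage,
    ResponseOutputText,
)
from openai.types.responses.response_usage import (
    InputTokensDetails,
    OutputTokensDetails,
    ResponseUsage,
)

from langchain_openai.chat_models.base import (
    _construct_lc_result_from_responses_api,
)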
Frequently Asked Questions
What does test__construct_lc_result_from_responses_api_basic_text_response() do?
test__construct_lc_result_from_responses_api_basic_text_response() is a unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py. It builds a minimal OpenAI Responses API Response containing a single assistant text message and usage counts, passes it to _construct_lc_result_from_responses_api(), and asserts that the resulting ChatResult holds one ChatGeneration whose AIMessage carries the expected text content, message and response IDs, usage metadata, and model name, for both the "v0" and the default "responses/v1" output versions.
Where is test__construct_lc_result_from_responses_api_basic_text_response() defined?
test__construct_lc_result_from_responses_api_basic_text_response() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py at line 1392.
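To run only this test, something like the following sketch should work, assuming pytest and the langchain-openai development dependencies are installed and the script is executed from the repository root.

# A minimal sketch: select this single test via pytest's Python entry point.
# Assumes the working directory is the langchain repository root.
import pytest

pytest.main(
    [
        "libs/partners/openai/tests/unit_tests/chat_models/test_base.py"
        "::test__construct_lc_result_from_responses_api_basic_text_response",
        "-q",
    ]
)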