test_gpt_5_temperature() — langchain Function Reference
Architecture documentation for the test_gpt_5_temperature() function in test_base.py from the langchain codebase.
Dependency Diagram
graph TD
    e1ad9ff5_bc50_fbcf_4a76_7729b0334fad["test_gpt_5_temperature()"]
    48232d20_f8c1_b597_14fa_7dc407e9bfe5["test_base.py"]
    e1ad9ff5_bc50_fbcf_4a76_7729b0334fad -->|defined in| 48232d20_f8c1_b597_14fa_7dc407e9bfe5
    style e1ad9ff5_bc50_fbcf_4a76_7729b0334fad fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/unit_tests/chat_models/test_base.py lines 3087–3101
def test_gpt_5_temperature(use_responses_api: bool) -> None:
    llm = ChatOpenAI(
        model="gpt-5-nano", temperature=0.5, use_responses_api=use_responses_api
    )
    messages = [HumanMessage(content="Hello")]
    payload = llm._get_request_payload(messages)
    assert "temperature" not in payload  # not supported for gpt-5 family models

    llm = ChatOpenAI(
        model="gpt-5-chat", temperature=0.5, use_responses_api=use_responses_api
    )
    messages = [HumanMessage(content="Hello")]
    payload = llm._get_request_payload(messages)
    assert payload["temperature"] == 0.5  # gpt-5-chat is exception
Frequently Asked Questions
What does test_gpt_5_temperature() do?
test_gpt_5_temperature() is a unit test in the langchain codebase, defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py. It verifies that ChatOpenAI omits the temperature parameter from request payloads for gpt-5 family models, which do not support it, while gpt-5-chat is treated as an exception and keeps the configured value.
Where is test_gpt_5_temperature() defined?
test_gpt_5_temperature() is defined in libs/partners/openai/tests/unit_tests/chat_models/test_base.py at line 3087.