_generate() — langchain Function Reference

Architecture documentation for the _generate() function in base.py from the langchain codebase.

Dependency Diagram

graph TD
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5["_generate()"]
  2a683305_667b_3567_cab9_9f77e29d4afa["BaseChatOpenAI"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|defined in| 2a683305_667b_3567_cab9_9f77e29d4afa
  6086a2e4_e2cd_389e_0d61_cb83485f9aef["_ensure_sync_client_available()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 6086a2e4_e2cd_389e_0d61_cb83485f9aef
  36b15b48_0822_029c_4a53_8243405e5a5e["_get_request_payload()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 36b15b48_0822_029c_4a53_8243405e5a5e
  64924e61_d0ac_6bbf_4818_aeb51576556d["_create_chat_result()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 64924e61_d0ac_6bbf_4818_aeb51576556d
  cdaeec71_b581_c482_d50c_4334d429196c["_is_pydantic_class()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| cdaeec71_b581_c482_d50c_4334d429196c
  06595fa5_189f_7f73_3a37_309f84e5179d["_construct_lc_result_from_responses_api()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 06595fa5_189f_7f73_3a37_309f84e5179d
  6a6e1bc7_82ad_0ec6_6f76_46c87a121099["_handle_openai_bad_request()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 6a6e1bc7_82ad_0ec6_6f76_46c87a121099
  9b7290da_4511_6588_b149_1f3f5856fece["_handle_openai_api_error()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 9b7290da_4511_6588_b149_1f3f5856fece
  e9866be0_b078_31ff_4159_1d040978b7e3["_use_responses_api()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| e9866be0_b078_31ff_4159_1d040978b7e3
  style ab5f2298_5c1f_3b25_f23a_6b16e835aba5 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/langchain_openai/chat_models/base.py lines 1379–1433

    def _generate(
        self,
        messages: list[BaseMessage],
        stop: list[str] | None = None,
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> ChatResult:
        self._ensure_sync_client_available()
        payload = self._get_request_payload(messages, stop=stop, **kwargs)
        generation_info = None
        raw_response = None
        try:
            if "response_format" in payload:
                payload.pop("stream")
                raw_response = (
                    self.root_client.chat.completions.with_raw_response.parse(**payload)
                )
                response = raw_response.parse()
            elif self._use_responses_api(payload):
                original_schema_obj = kwargs.get("response_format")
                if original_schema_obj and _is_pydantic_class(original_schema_obj):
                    raw_response = self.root_client.responses.with_raw_response.parse(
                        **payload
                    )
                else:
                    raw_response = self.root_client.responses.with_raw_response.create(
                        **payload
                    )
                response = raw_response.parse()
                if self.include_response_headers:
                    generation_info = {"headers": dict(raw_response.headers)}
                return _construct_lc_result_from_responses_api(
                    response,
                    schema=original_schema_obj,
                    metadata=generation_info,
                    output_version=self.output_version,
                )
            else:
                raw_response = self.client.with_raw_response.create(**payload)
                response = raw_response.parse()
        except openai.BadRequestError as e:
            _handle_openai_bad_request(e)
        except openai.APIError as e:
            _handle_openai_api_error(e)
        except Exception as e:
            if raw_response is not None and hasattr(raw_response, "http_response"):
                e.response = raw_response.http_response  # type: ignore[attr-defined]
            raise e
        if (
            self.include_response_headers
            and raw_response is not None
            and hasattr(raw_response, "headers")
        ):
            generation_info = {"headers": dict(raw_response.headers)}
        return self._create_chat_result(response, generation_info)
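
The try block above chooses one of three request paths. The routing can be reduced to a small sketch; this is an illustrative simplification, not the real implementation: `route_request` and its return values are hypothetical names, and the example payload stands in for the dict built by `_get_request_payload()`.

```python
def route_request(payload: dict, use_responses_api: bool) -> str:
    """Hypothetical mirror of the three branches _generate() selects among."""
    if "response_format" in payload:
        # Structured output goes through the Chat Completions parse endpoint;
        # _generate() drops the stream flag before calling it.
        payload.pop("stream", None)
        return "chat.completions.parse"
    if use_responses_api:
        # Responses API path: parse() for Pydantic schemas, create() otherwise.
        return "responses"
    # Default path: plain Chat Completions create().
    return "chat.completions.create"


payload = {"stream": True, "response_format": {"type": "json_object"}}
endpoint = route_request(payload, use_responses_api=False)
```

In the real method each branch calls the client's `with_raw_response` variant, which is what keeps the HTTP headers available for `generation_info` when `include_response_headers` is set.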

Frequently Asked Questions

What does _generate() do?
_generate() is a method of the BaseChatOpenAI class that sends a prepared chat request to the OpenAI API — routing between the Chat Completions and Responses endpoints — and returns a ChatResult. It is defined in libs/partners/openai/langchain_openai/chat_models/base.py.
Where is _generate() defined?
_generate() is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 1379.
What does _generate() call?
_generate() calls eight functions: _construct_lc_result_from_responses_api, _create_chat_result, _ensure_sync_client_available, _get_request_payload, _handle_openai_api_error, _handle_openai_bad_request, _is_pydantic_class, _use_responses_api.
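
Beyond the named helpers, the listing's final `except Exception` clause attaches the captured raw HTTP response to the exception before re-raising. A minimal sketch of that enrichment pattern (the `FakeRawResponse` stub and `parse_or_enrich` are illustrative names, standing in for the SDK's raw-response wrapper and the failing parse step):

```python
class FakeRawResponse:
    """Stub for the SDK's raw-response wrapper (illustrative only)."""
    http_response = "<http response with status and headers>"


def parse_or_enrich(raw_response):
    try:
        raise ValueError("parse failed")  # simulate a parsing error
    except Exception as e:
        # Mirror _generate(): expose the HTTP response on the exception
        # so callers can inspect status codes and headers after the fact.
        if raw_response is not None and hasattr(raw_response, "http_response"):
            e.response = raw_response.http_response
        raise
```

Python exceptions accept arbitrary attributes, which is why `e.response` can be set here even though ValueError does not define it.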
