_create_chat_result() — langchain Function Reference

Architecture documentation for the _create_chat_result() function in base.py from the langchain codebase.

Dependency Diagram

graph TD
  64924e61_d0ac_6bbf_4818_aeb51576556d["_create_chat_result()"]
  2a683305_667b_3567_cab9_9f77e29d4afa["BaseChatOpenAI"]
  64924e61_d0ac_6bbf_4818_aeb51576556d -->|defined in| 2a683305_667b_3567_cab9_9f77e29d4afa
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5["_generate()"]
  ab5f2298_5c1f_3b25_f23a_6b16e835aba5 -->|calls| 64924e61_d0ac_6bbf_4818_aeb51576556d
  2d1ed7ab_3dc5_34eb_9bfb_4f79c345fc6b["_get_generation_chunk_from_completion()"]
  2d1ed7ab_3dc5_34eb_9bfb_4f79c345fc6b -->|calls| 64924e61_d0ac_6bbf_4818_aeb51576556d
  a917de1a_4c3d_a1f9_75da_69038ccf1450["_convert_dict_to_message()"]
  64924e61_d0ac_6bbf_4818_aeb51576556d -->|calls| a917de1a_4c3d_a1f9_75da_69038ccf1450
  81a7400d_9d67_48c6_dcb8_b344f5cba3ee["_create_usage_metadata()"]
  64924e61_d0ac_6bbf_4818_aeb51576556d -->|calls| 81a7400d_9d67_48c6_dcb8_b344f5cba3ee
  style 64924e61_d0ac_6bbf_4818_aeb51576556d fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/langchain_openai/chat_models/base.py lines 1480–1547

    def _create_chat_result(
        self,
        response: dict | openai.BaseModel,
        generation_info: dict | None = None,
    ) -> ChatResult:
        generations = []

        response_dict = (
            response if isinstance(response, dict) else response.model_dump()
        )
        # Sometimes the model call returns an error, which we should raise (this
        # is typically accompanied by a null value for `choices`, which is
        # handled separately below).
        if response_dict.get("error"):
            raise ValueError(response_dict.get("error"))

        # Raise informative error messages for non-OpenAI chat completions APIs
        # that return malformed responses.
        try:
            choices = response_dict["choices"]
        except KeyError as e:
            msg = f"Response missing `choices` key: {response_dict.keys()}"
            raise KeyError(msg) from e

        if choices is None:
            msg = "Received response with null value for `choices`."
            raise TypeError(msg)

        token_usage = response_dict.get("usage")
        service_tier = response_dict.get("service_tier")

        for res in choices:
            message = _convert_dict_to_message(res["message"])
            if token_usage and isinstance(message, AIMessage):
                message.usage_metadata = _create_usage_metadata(
                    token_usage, service_tier
                )
            generation_info = generation_info or {}
            generation_info["finish_reason"] = (
                res.get("finish_reason")
                if res.get("finish_reason") is not None
                else generation_info.get("finish_reason")
            )
            if "logprobs" in res:
                generation_info["logprobs"] = res["logprobs"]
            gen = ChatGeneration(message=message, generation_info=generation_info)
            generations.append(gen)
        llm_output = {
            "token_usage": token_usage,
            "model_provider": "openai",
            "model_name": response_dict.get("model", self.model_name),
            "system_fingerprint": response_dict.get("system_fingerprint", ""),
        }
        if "id" in response_dict:
            llm_output["id"] = response_dict["id"]
        if service_tier:
            llm_output["service_tier"] = service_tier

        if isinstance(response, openai.BaseModel) and getattr(
            response, "choices", None
        ):
            message = response.choices[0].message  # type: ignore[attr-defined]
            if hasattr(message, "parsed"):
                generations[0].message.additional_kwargs["parsed"] = message.parsed
            if hasattr(message, "refusal"):
                generations[0].message.additional_kwargs["refusal"] = message.refusal

        return ChatResult(generations=generations, llm_output=llm_output)
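The function's early-exit validation is worth seeing in isolation. The sketch below is a standalone mimic of just those checks (not the langchain code itself, which also builds `ChatGeneration` objects): an `error` key raises `ValueError`, a missing `choices` key raises `KeyError`, and an explicit `null` for `choices` raises `TypeError`.

```python
def validate_chat_response(response_dict: dict) -> list:
    """Standalone sketch of the validation steps in _create_chat_result()."""
    # An error payload from the API is surfaced immediately.
    if response_dict.get("error"):
        raise ValueError(response_dict.get("error"))

    # Non-OpenAI chat completions APIs sometimes omit `choices` entirely.
    try:
        choices = response_dict["choices"]
    except KeyError as e:
        msg = f"Response missing `choices` key: {response_dict.keys()}"
        raise KeyError(msg) from e

    # Or they return it as an explicit null.
    if choices is None:
        msg = "Received response with null value for `choices`."
        raise TypeError(msg)

    return choices


# Each malformed shape maps to a distinct exception type:
for bad_response, expected_exc in [
    ({"error": {"message": "rate limited"}}, ValueError),
    ({"id": "chatcmpl-123"}, KeyError),
    ({"choices": None}, TypeError),
]:
    try:
        validate_chat_response(bad_response)
    except expected_exc:
        pass  # expected
```

Distinguishing the three failure modes gives callers an informative error instead of a generic `TypeError` from iterating `None` further down.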

Frequently Asked Questions

What does _create_chat_result() do?
_create_chat_result() converts a raw Chat Completions response (a dict or an openai.BaseModel) into a ChatResult: it validates the response, builds a ChatGeneration for each entry in `choices`, attaches usage metadata to AIMessages, and collects model metadata (model name, system fingerprint, service tier) into llm_output.
Where is _create_chat_result() defined?
_create_chat_result() is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 1480.
What does _create_chat_result() call?
_create_chat_result() calls 2 function(s): _convert_dict_to_message, _create_usage_metadata.
What calls _create_chat_result()?
_create_chat_result() is called by 2 function(s): _generate, _get_generation_chunk_from_completion.
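To make the input shape concrete, here is a hypothetical well-formed Chat Completions response dict (field names mirror the OpenAI response shape; the values are made up), together with the per-choice bookkeeping the function performs: `finish_reason` falls back to any caller-supplied `generation_info` value, and `logprobs` is copied through when the key is present.

```python
# Hypothetical response dict covering the fields _create_chat_result() reads.
response_dict = {
    "id": "chatcmpl-abc123",
    "model": "gpt-4o-2024-08-06",
    "system_fingerprint": "fp_example",
    "service_tier": "default",
    "usage": {"prompt_tokens": 12, "completion_tokens": 5, "total_tokens": 17},
    "choices": [
        {
            "message": {"role": "assistant", "content": "Hello!"},
            "finish_reason": "stop",
            "logprobs": None,
        }
    ],
}

# Per-choice bookkeeping, mirroring the loop in the source listing above.
generation_info: dict = {}
for res in response_dict["choices"]:
    generation_info["finish_reason"] = (
        res.get("finish_reason")
        if res.get("finish_reason") is not None
        else generation_info.get("finish_reason")
    )
    if "logprobs" in res:
        generation_info["logprobs"] = res["logprobs"]

# Model-level metadata collected into llm_output.
llm_output = {
    "token_usage": response_dict.get("usage"),
    "model_provider": "openai",
    "model_name": response_dict.get("model"),
    "system_fingerprint": response_dict.get("system_fingerprint", ""),
}
```

Note that `generation_info` is mutated across iterations, so a `finish_reason` from an earlier choice (or from the caller) survives when a later choice omits it.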
