_create_chat_result() — langchain Function Reference

Architecture documentation for the _create_chat_result() method of AzureChatOpenAI, defined in azure.py in the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  eaa333b9_7f00_4950_bb74_8d8bc0041cbc["_create_chat_result()"]
  4b940a4c_8618_3ca0_9fa5_c378f5b8f6ac["AzureChatOpenAI"]
  eaa333b9_7f00_4950_bb74_8d8bc0041cbc -->|defined in| 4b940a4c_8618_3ca0_9fa5_c378f5b8f6ac
  style eaa333b9_7f00_4950_bb74_8d8bc0041cbc fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/langchain_openai/chat_models/azure.py lines 757–794

    def _create_chat_result(
        self,
        response: dict | openai.BaseModel,
        generation_info: dict | None = None,
    ) -> ChatResult:
        chat_result = super()._create_chat_result(response, generation_info)

        if not isinstance(response, dict):
            response = response.model_dump()
        for res in response["choices"]:
            if res.get("finish_reason", None) == "content_filter":
                msg = (
                    "Azure has not provided the response due to a content filter "
                    "being triggered"
                )
                raise ValueError(msg)

        if "model" in response:
            model = response["model"]
            if self.model_version:
                model = f"{model}-{self.model_version}"

            chat_result.llm_output = chat_result.llm_output or {}
            chat_result.llm_output["model_name"] = model
        if "prompt_filter_results" in response:
            chat_result.llm_output = chat_result.llm_output or {}
            chat_result.llm_output["prompt_filter_results"] = response[
                "prompt_filter_results"
            ]
        for chat_gen, response_choice in zip(
            chat_result.generations, response["choices"], strict=False
        ):
            chat_gen.generation_info = chat_gen.generation_info or {}
            chat_gen.generation_info["content_filter_results"] = response_choice.get(
                "content_filter_results", {}
            )

        return chat_result

Frequently Asked Questions

What does _create_chat_result() do?
_create_chat_result() overrides the base OpenAI implementation to post-process Azure responses: it raises a ValueError when Azure's content filter withholds a response, appends the configured model_version to the reported model name, and attaches prompt-level and per-choice content filter results to the ChatResult metadata.
Where is _create_chat_result() defined?
_create_chat_result() is defined in libs/partners/openai/langchain_openai/chat_models/azure.py at line 757.
