_create_chat_result() — langchain Function Reference

Architecture documentation for the _create_chat_result() method of ChatGroq in libs/partners/groq/langchain_groq/chat_models.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  createChatResult["_create_chat_result()"]
  chatGroq["ChatGroq"]
  createChatResult -->|defined in| chatGroq
  generate["_generate()"]
  generate -->|calls| createChatResult
  agenerate["_agenerate()"]
  agenerate -->|calls| createChatResult
  convertDictToMessage["_convert_dict_to_message()"]
  createChatResult -->|calls| convertDictToMessage
  createUsageMetadata["_create_usage_metadata()"]
  createChatResult -->|calls| createUsageMetadata
  style createChatResult fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/groq/langchain_groq/chat_models.py lines 774–802

    def _create_chat_result(
        self, response: dict | BaseModel, params: dict
    ) -> ChatResult:
        generations = []
        if not isinstance(response, dict):
            response = response.model_dump()
        token_usage = response.get("usage", {})
        for res in response["choices"]:
            message = _convert_dict_to_message(res["message"])
            if token_usage and isinstance(message, AIMessage):
                message.usage_metadata = _create_usage_metadata(token_usage)
            generation_info = {"finish_reason": res.get("finish_reason")}
            if "logprobs" in res:
                generation_info["logprobs"] = res["logprobs"]
            gen = ChatGeneration(
                message=message,
                generation_info=generation_info,
            )
            generations.append(gen)
        llm_output = {
            "token_usage": token_usage,
            "model_name": self.model_name,
            "system_fingerprint": response.get("system_fingerprint", ""),
        }
        llm_output["service_tier"] = params.get("service_tier") or self.service_tier
        reasoning_effort = params.get("reasoning_effort") or self.reasoning_effort
        if reasoning_effort:
            llm_output["reasoning_effort"] = reasoning_effort
        return ChatResult(generations=generations, llm_output=llm_output)

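The snippet below is a minimal illustrative sketch, not part of the langchain source. It feeds a hand-written payload in the Groq/OpenAI-style shape implied by the accesses above (choices, usage, system_fingerprint) directly to the method; the model name, dummy API key, and field values are assumptions for illustration, and no network call is made.

    from langchain_groq import ChatGroq

    # Construct a client locally; the dummy key is never used because we pass a
    # pre-built response dict straight to the method instead of hitting the API.
    llm = ChatGroq(model="llama-3.1-8b-instant", api_key="dummy-key")

    # Hypothetical response in the shape the method reads: "choices" entries with
    # a "message" and "finish_reason", plus optional "usage" and "system_fingerprint".
    response = {
        "choices": [
            {
                "message": {"role": "assistant", "content": "Hello!"},
                "finish_reason": "stop",
            }
        ],
        "usage": {"prompt_tokens": 9, "completion_tokens": 3, "total_tokens": 12},
        "system_fingerprint": "fp_example",
    }

    result = llm._create_chat_result(response, params={})
    print(result.generations[0].message.content)   # "Hello!"
    print(result.generations[0].generation_info)   # {"finish_reason": "stop"}
    print(result.llm_output["token_usage"])        # the "usage" block above

In normal use this private method is not called directly; it is invoked internally by _generate() and _agenerate() after the Groq client returns a completion.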

Frequently Asked Questions

What does _create_chat_result() do?
_create_chat_result() converts a raw Groq chat-completions response (a dict or Pydantic BaseModel) into a LangChain ChatResult: it builds a ChatGeneration for each choice via _convert_dict_to_message(), attaches usage metadata to AIMessage outputs, and records token usage, model name, system fingerprint, service tier, and (when set) reasoning effort in llm_output.
Where is _create_chat_result() defined?
_create_chat_result() is defined in libs/partners/groq/langchain_groq/chat_models.py at line 774.
What does _create_chat_result() call?
_create_chat_result() calls two helper functions: _convert_dict_to_message(), which turns each choice's message dict into a LangChain message, and _create_usage_metadata(), which builds usage metadata from the token-usage block.
What calls _create_chat_result()?
_create_chat_result() is called by _generate() and _agenerate(), the synchronous and asynchronous generation methods of ChatGroq.
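For context, here is a simplified, hedged sketch of that call site. It follows the typical langchain_groq pattern; the helper name _create_message_dicts, the client attribute, and the exact parameter plumbing are assumptions and may differ between versions, and run-manager handling and streaming branches are omitted.

    # Sketch of how _generate() hands the raw client response to _create_chat_result().
    def _generate(self, messages, stop=None, run_manager=None, **kwargs):
        message_dicts, params = self._create_message_dicts(messages, stop)
        params = {**params, **kwargs}
        response = self.client.create(messages=message_dicts, **params)
        return self._create_chat_result(response, params)

    # _agenerate() follows the same shape, awaiting the async client instead.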

Analyze Your Own Codebase

Get architecture documentation, dependency graphs, and domain analysis for your codebase in minutes.

Try Supermodel Free