_combine_llm_outputs() — langchain Function Reference

Architecture documentation for the _combine_llm_outputs() method defined in libs/partners/openai/langchain_openai/chat_models/base.py in the langchain codebase.

Dependency Diagram

graph TD
  e723f22f_4b01_a370_0d2f_ecea13f9c923["_combine_llm_outputs()"]
  2a683305_667b_3567_cab9_9f77e29d4afa["BaseChatOpenAI"]
  e723f22f_4b01_a370_0d2f_ecea13f9c923 -->|defined in| 2a683305_667b_3567_cab9_9f77e29d4afa
  c5c2540e_8622_6410_cb0a_4b60c4805659["_update_token_usage()"]
  e723f22f_4b01_a370_0d2f_ecea13f9c923 -->|calls| c5c2540e_8622_6410_cb0a_4b60c4805659
  style e723f22f_4b01_a370_0d2f_ecea13f9c923 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/openai/langchain_openai/chat_models/base.py lines 1067–1090

    def _combine_llm_outputs(self, llm_outputs: list[dict | None]) -> dict:
        overall_token_usage: dict = {}
        system_fingerprint = None
        for output in llm_outputs:
            if output is None:
                # Happens in streaming
                continue
            token_usage = output.get("token_usage")
            if token_usage is not None:
                for k, v in token_usage.items():
                    if v is None:
                        continue
                    if k in overall_token_usage:
                        overall_token_usage[k] = _update_token_usage(
                            overall_token_usage[k], v
                        )
                    else:
                        overall_token_usage[k] = v
            if system_fingerprint is None:
                system_fingerprint = output.get("system_fingerprint")
        combined = {"token_usage": overall_token_usage, "model_name": self.model_name}
        if system_fingerprint:
            combined["system_fingerprint"] = system_fingerprint
        return combined
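The merge semantics above can be exercised in isolation. Below is a minimal, self-contained sketch of the same aggregation; the stand-in `_update_token_usage` here is a hypothetical simplification that just sums integer counts, whereas the real helper in langchain_openai is not reproduced in this document.

```python
# Self-contained sketch of the aggregation logic in _combine_llm_outputs().
# Assumption: token counts are additive integers; the real _update_token_usage
# helper may handle more cases (e.g. nested detail dicts).
def _update_token_usage(existing, new):
    return existing + new  # assumed: leaf counts are simply summed


def combine_llm_outputs(llm_outputs, model_name):
    overall_token_usage = {}
    system_fingerprint = None
    for output in llm_outputs:
        if output is None:
            # Happens in streaming: some generations carry no llm_output.
            continue
        for k, v in (output.get("token_usage") or {}).items():
            if v is None:
                continue
            if k in overall_token_usage:
                overall_token_usage[k] = _update_token_usage(overall_token_usage[k], v)
            else:
                overall_token_usage[k] = v
        if system_fingerprint is None:
            system_fingerprint = output.get("system_fingerprint")
    combined = {"token_usage": overall_token_usage, "model_name": model_name}
    if system_fingerprint:
        combined["system_fingerprint"] = system_fingerprint
    return combined


outputs = [
    {"token_usage": {"prompt_tokens": 10, "completion_tokens": 5}},
    None,  # e.g. a streamed generation with no usage payload
    {"token_usage": {"prompt_tokens": 7, "completion_tokens": 3},
     "system_fingerprint": "fp_abc"},
]
print(combine_llm_outputs(outputs, "gpt-4o"))
# -> {'token_usage': {'prompt_tokens': 17, 'completion_tokens': 8},
#     'model_name': 'gpt-4o', 'system_fingerprint': 'fp_abc'}
```

Note that only the first non-None system_fingerprint is kept, and keys absent from the running total are copied in directly rather than merged.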

Frequently Asked Questions

What does _combine_llm_outputs() do?
_combine_llm_outputs() merges the llm_output dicts from multiple generations into a single dict: it aggregates token usage per key via _update_token_usage(), records the model name, and carries over the first non-None system_fingerprint it encounters. None entries, which occur during streaming, are skipped.
Where is _combine_llm_outputs() defined?
_combine_llm_outputs() is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 1067.
What does _combine_llm_outputs() call?
_combine_llm_outputs() calls one function: _update_token_usage(), which it uses to fold each new token count into the running total.
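The actual implementation of _update_token_usage is not shown on this page. As a hypothetical sketch of what such a merge helper could look like, assuming counts are additive and usage payloads may contain nested detail dicts (the names below are illustrative, not the real API):

```python
# Hypothetical merge helper, NOT the real langchain_openai implementation.
# Assumption: dict values are merged recursively, numeric leaves are summed.
def update_token_usage(existing, new):
    if isinstance(existing, dict) and isinstance(new, dict):
        merged = dict(existing)
        for k, v in new.items():
            merged[k] = update_token_usage(merged[k], v) if k in merged else v
        return merged
    return existing + new  # leaf counts are treated as additive


merged = update_token_usage(
    {"prompt_tokens": 10, "completion_tokens_details": {"reasoning_tokens": 2}},
    {"prompt_tokens": 5, "completion_tokens_details": {"reasoning_tokens": 4}},
)
print(merged)
# -> {'prompt_tokens': 15, 'completion_tokens_details': {'reasoning_tokens': 6}}
```

Consult the langchain_openai source for the helper's real behavior.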
