LLMResult Class — langchain Architecture

Architecture documentation for the LLMResult class in llm_result.py from the langchain codebase.

Entity Profile

LLMResult is a class (a Pydantic model extending BaseModel) defined in libs/core/langchain_core/outputs/llm_result.py. It exposes two notable methods: flatten() and __eq__().

Dependency Diagram

graph TD
  LLMResult["LLMResult"]
  BaseModel["BaseModel"]
  LLMResult -->|extends| BaseModel
  llm_result_py["llm_result.py"]
  LLMResult -->|defined in| llm_result_py
  flatten_method["flatten()"]
  LLMResult -->|method| flatten_method
  eq_method["__eq__()"]
  LLMResult -->|method| eq_method

Source Code

libs/core/langchain_core/outputs/llm_result.py (imports, plus the class body from lines 15–111)

from __future__ import annotations

from copy import deepcopy
from typing import Literal

from pydantic import BaseModel

from langchain_core.outputs.chat_generation import ChatGeneration, ChatGenerationChunk
from langchain_core.outputs.generation import Generation, GenerationChunk
from langchain_core.outputs.run_info import RunInfo


class LLMResult(BaseModel):
    """A container for results of an LLM call.

    Both chat models and LLMs generate an `LLMResult` object. This object contains the
    generated outputs and any additional information that the model provider wants to
    return.
    """

    generations: list[
        list[Generation | ChatGeneration | GenerationChunk | ChatGenerationChunk]
    ]
    """Generated outputs.

    The first dimension of the list represents completions for different input prompts.

    The second dimension of the list represents different candidate generations for a
    given prompt.

    - When returned from **an LLM**, the type is `list[list[Generation]]`.
    - When returned from a **chat model**, the type is `list[list[ChatGeneration]]`.

    `ChatGeneration` is a subclass of `Generation` that has a field for a structured
    chat message.
    """

    llm_output: dict | None = None
    """For arbitrary LLM provider specific output.

    This dictionary is a free-form dictionary that can contain any information that the
    provider wants to return. It is not standardized and is provider-specific.

    Users should generally avoid relying on this field and instead rely on accessing
    relevant information from standardized fields present in AIMessage.
    """

    run: list[RunInfo] | None = None
    """List of metadata info for model call for each input.

    See `langchain_core.outputs.run_info.RunInfo` for details.
    """

    type: Literal["LLMResult"] = "LLMResult"
    """Type is used exclusively for serialization purposes."""

    def flatten(self) -> list[LLMResult]:
        """Flatten generations into a single list.

        Unpack `list[list[Generation]] -> list[LLMResult]` where each returned
        `LLMResult` contains only a single `Generation`. If token usage information is
        available, it is kept only for the `LLMResult` corresponding to the top-choice
        `Generation`, to avoid over-counting of token usage downstream.

        Returns:
            List of `LLMResult` objects where each returned `LLMResult` contains a
                single `Generation`.
        """
        llm_results = []
        for i, gen_list in enumerate(self.generations):
            # Avoid double counting tokens in OpenAICallback
            if i == 0:
                llm_results.append(
                    LLMResult(
                        generations=[gen_list],
                        llm_output=self.llm_output,
                    )
                )
            else:
                if self.llm_output is not None:
                    llm_output = deepcopy(self.llm_output)
                    llm_output["token_usage"] = {}
                else:
                    llm_output = None
                llm_results.append(
                    LLMResult(
                        generations=[gen_list],
                        llm_output=llm_output,
                    )
                )
        return llm_results

    def __eq__(self, other: object) -> bool:
        """Check for LLMResult equality by ignoring any metadata related to runs."""
        if not isinstance(other, LLMResult):
            return NotImplemented
        return (
            self.generations == other.generations
            and self.llm_output == other.llm_output
        )
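
To make flatten() and __eq__() concrete, here is a minimal usage sketch. The prompt text, token counts, and run ID below are invented for illustration; only the imports and the assertions about flatten()/__eq__() behavior follow from the source above.

# Minimal usage sketch; values are invented for illustration.
import uuid

from langchain_core.outputs import Generation, LLMResult, RunInfo

result = LLMResult(
    # Two input prompts, one candidate generation each.
    generations=[
        [Generation(text="answer to prompt 1")],
        [Generation(text="answer to prompt 2")],
    ],
    llm_output={"token_usage": {"total_tokens": 42}},
)

flat = result.flatten()
assert len(flat) == 2

# Token usage survives only on the first flattened result, so downstream
# callbacks do not double count tokens.
assert flat[0].llm_output == {"token_usage": {"total_tokens": 42}}
assert flat[1].llm_output == {"token_usage": {}}

# __eq__ ignores run metadata: results differing only in `run` compare equal.
a = LLMResult(
    generations=[[Generation(text="hi")]],
    run=[RunInfo(run_id=uuid.uuid4())],
)
b = LLMResult(generations=[[Generation(text="hi")]])
assert a == b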

Extends

BaseModel — LLMResult is a Pydantic model; it inherits field validation and serialization behavior from pydantic's BaseModel.

Frequently Asked Questions

What is the LLMResult class?
LLMResult is a container for the results of an LLM call. Both chat models and LLMs generate an LLMResult object, which holds the generated outputs along with any additional information the model provider returns. It is defined in libs/core/langchain_core/outputs/llm_result.py.
Where is LLMResult defined?
LLMResult is defined in libs/core/langchain_core/outputs/llm_result.py at line 15.
What does LLMResult extend?
LLMResult extends BaseModel from pydantic.
