chat_generation.py — langchain Source File
Architecture documentation for chat_generation.py, a Python file in the langchain codebase. 6 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
  0b5a237e_9feb_c3f3_db00_596824d6e26d["chat_generation.py"]
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  0b5a237e_9feb_c3f3_db00_596824d6e26d --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  6e58aaea_f08e_c099_3cc7_f9567bfb1ae7["pydantic"]
  0b5a237e_9feb_c3f3_db00_596824d6e26d --> 6e58aaea_f08e_c099_3cc7_f9567bfb1ae7
  d758344f_537f_649e_f467_b9d7442e86df["langchain_core.messages"]
  0b5a237e_9feb_c3f3_db00_596824d6e26d --> d758344f_537f_649e_f467_b9d7442e86df
  7f324a3e_1ccd_321c_fef1_09174d50081d["langchain_core.outputs.generation"]
  0b5a237e_9feb_c3f3_db00_596824d6e26d --> 7f324a3e_1ccd_321c_fef1_09174d50081d
  053c6d65_7a74_9819_7c2a_c7357c95d2b8["langchain_core.utils._merge"]
  0b5a237e_9feb_c3f3_db00_596824d6e26d --> 053c6d65_7a74_9819_7c2a_c7357c95d2b8
  91721f45_4909_e489_8c1f_084f8bd87145["typing_extensions"]
  0b5a237e_9feb_c3f3_db00_596824d6e26d --> 91721f45_4909_e489_8c1f_084f8bd87145
  style 0b5a237e_9feb_c3f3_db00_596824d6e26d fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
"""Chat generation output classes."""
from __future__ import annotations
from typing import TYPE_CHECKING, Literal
from pydantic import model_validator
from langchain_core.messages import BaseMessage, BaseMessageChunk
from langchain_core.outputs.generation import Generation
from langchain_core.utils._merge import merge_dicts
if TYPE_CHECKING:
    from typing_extensions import Self


class ChatGeneration(Generation):
    """A single chat generation output.

    A subclass of `Generation` that represents the response from a chat model that
    generates chat messages.

    The `message` attribute is a structured representation of the chat message. Most of
    the time, the message will be of type `AIMessage`.

    Users working with chat models will usually access information via either
    `AIMessage` (returned from runnable interfaces) or `LLMResult` (available via
    callbacks).
    """

    text: str = ""
    """The text contents of the output message.

    !!! warning "SHOULD NOT BE SET DIRECTLY!"
    """

    message: BaseMessage
    """The message output by the chat model."""

    # Override type to be ChatGeneration, ignore mypy error as this is intentional
    type: Literal["ChatGeneration"] = "ChatGeneration"  # type: ignore[assignment]
    """Type is used exclusively for serialization purposes."""

    @model_validator(mode="after")
    def set_text(self) -> Self:
        """Set the text attribute to be the contents of the message.

        Args:
            values: The values of the object.

        Returns:
            The values of the object with the text attribute set.

        Raises:
            ValueError: If the message is not a string or a list.
        """
        text = ""
        if isinstance(self.message.content, str):
            text = self.message.content
        # Extracts first text block from content blocks.
        # ... (87 more lines)
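To illustrate the behavior documented above, here is a minimal usage sketch (a hypothetical example, not part of this file; AIMessage comes from langchain_core.messages, which is already imported here as a dependency):

from langchain_core.messages import AIMessage
from langchain_core.outputs.chat_generation import ChatGeneration

# The model_validator(mode="after") hook copies the message content into the
# `text` field after construction, which is why `text` should not be set directly.
generation = ChatGeneration(message=AIMessage(content="Hello, world!"))
print(generation.text)  # "Hello, world!"
print(generation.type)  # "ChatGeneration" (used only for serialization)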
Domain
CoreAbstractions
Subdomains
MessageSchema
Dependencies
- langchain_core.messages
- langchain_core.outputs.generation
- langchain_core.utils._merge
- pydantic
- typing
- typing_extensions
Source
libs/core/langchain_core/outputs/chat_generation.py
Frequently Asked Questions
What does chat_generation.py do?
chat_generation.py is a Python source file in the langchain codebase that defines the chat generation output classes, which wrap the message produced by a chat model. It belongs to the CoreAbstractions domain, MessageSchema subdomain.
What functions are defined in chat_generation.py?
chat_generation.py defines one module-level function, merge_chat_generation_chunks, in addition to the ChatGeneration class shown above. (typing_extensions is an imported module, not a function defined in this file.)
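As a hedged sketch of how that helper is typically used (assuming merge_chat_generation_chunks accepts a list of ChatGenerationChunk objects and returns a single merged chunk, and that ChatGenerationChunk is defined in the truncated portion of this file):

from langchain_core.messages import AIMessageChunk
from langchain_core.outputs.chat_generation import (
    ChatGenerationChunk,
    merge_chat_generation_chunks,
)

# Streaming chat models emit partial chunks; merging concatenates their content
# and combines generation_info (this is where the langchain_core.utils._merge
# dependency comes in).
chunks = [
    ChatGenerationChunk(message=AIMessageChunk(content="Hello, ")),
    ChatGenerationChunk(message=AIMessageChunk(content="world!")),
]
merged = merge_chat_generation_chunks(chunks)
print(merged.text)  # expected: "Hello, world!" under the assumptions above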
What does chat_generation.py depend on?
chat_generation.py imports 6 modules: langchain_core.messages, langchain_core.outputs.generation, langchain_core.utils._merge, pydantic, typing, typing_extensions.
Where is chat_generation.py in the architecture?
chat_generation.py is located at libs/core/langchain_core/outputs/chat_generation.py (domain: CoreAbstractions, subdomain: MessageSchema, directory: libs/core/langchain_core/outputs).
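Although the file lives under libs/core/langchain_core/outputs, consumers normally import its classes from the package's public path rather than the module path (a langchain_core re-export convention assumed here, not shown in the listing):

from langchain_core.outputs import ChatGeneration, ChatGenerationChunk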