AIMessageChunk Class — langchain Architecture
Architecture documentation for the AIMessageChunk class in ai.py from the langchain codebase.
Entity Profile
AIMessageChunk is a class defined in libs/core/langchain_core/messages/ai.py (lines 413–635). It extends AIMessage and BaseMessageChunk, and represents a message chunk from an AI, yielded when a response is streamed.
Dependency Diagram
graph TD
    AIMessageChunk["AIMessageChunk"]
    AIMessage["AIMessage"]
    BaseMessageChunk["BaseMessageChunk"]
    ai_py["ai.py"]
    lc_attributes["lc_attributes()"]
    content_blocks["content_blocks()"]
    init_tool_calls["init_tool_calls()"]
    init_server_tool_calls["init_server_tool_calls()"]
    add["__add__()"]
    AIMessageChunk -->|extends| AIMessage
    AIMessageChunk -->|extends| BaseMessageChunk
    AIMessageChunk -->|defined in| ai_py
    AIMessageChunk -->|method| lc_attributes
    AIMessageChunk -->|method| content_blocks
    AIMessageChunk -->|method| init_tool_calls
    AIMessageChunk -->|method| init_server_tool_calls
    AIMessageChunk -->|method| add
Source Code
libs/core/langchain_core/messages/ai.py, lines 413–635 (excerpt; the init_tool_calls(), init_server_tool_calls(), and __add__() methods listed in the diagram fall in the same range but are omitted below)
class AIMessageChunk(AIMessage, BaseMessageChunk):
    """Message chunk from an AI (yielded when streaming)."""

    # Ignoring mypy re-assignment here since we're overriding the value
    # to make sure that the chunk variant can be discriminated from the
    # non-chunk variant.
    type: Literal["AIMessageChunk"] = "AIMessageChunk"  # type: ignore[assignment]
    """The type of the message (used for deserialization)."""

    tool_call_chunks: list[ToolCallChunk] = Field(default_factory=list)
    """If provided, tool call chunks associated with the message."""

    chunk_position: Literal["last"] | None = None
    """Optional span represented by an aggregated `AIMessageChunk`.

    If a chunk with `chunk_position="last"` is aggregated into a stream,
    `tool_call_chunks` in message content will be parsed into `tool_calls`.
    """

    @property
    @override
    def lc_attributes(self) -> dict:
        return {
            "tool_calls": self.tool_calls,
            "invalid_tool_calls": self.invalid_tool_calls,
        }

    @property
    def content_blocks(self) -> list[types.ContentBlock]:
        """Return standard, typed `ContentBlock` dicts from the message."""
        if self.response_metadata.get("output_version") == "v1":
            return cast("list[types.ContentBlock]", self.content)

        model_provider = self.response_metadata.get("model_provider")
        if model_provider:
            from langchain_core.messages.block_translators import (  # noqa: PLC0415
                get_translator,
            )

            translator = get_translator(model_provider)
            if translator:
                try:
                    return translator["translate_content_chunk"](self)
                except NotImplementedError:
                    pass

        # Otherwise, use best-effort parsing
        blocks = super().content_blocks

        if (
            self.tool_call_chunks
            and not self.content
            and self.chunk_position != "last"  # keep tool_calls if aggregated
        ):
            blocks = [
                block
                for block in blocks
                if block["type"] not in {"tool_call", "invalid_tool_call"}
            ]
            for tool_call_chunk in self.tool_call_chunks:
                tc: types.ToolCallChunk = {
                    "type": "tool_call_chunk",
                    "id": tool_call_chunk.get("id"),
                    "name": tool_call_chunk.get("name"),
                    "args": tool_call_chunk.get("args"),
                }
                if (idx := tool_call_chunk.get("index")) is not None:
                    tc["index"] = idx
                blocks.append(tc)

        # Best-effort reasoning extraction from additional_kwargs
        # Only add reasoning if not already present
        # Insert before all other blocks to keep reasoning at the start
        has_reasoning = any(block.get("type") == "reasoning" for block in blocks)
        if not has_reasoning and (
            reasoning_block := _extract_reasoning_from_additional_kwargs(self)
        ):
            blocks.insert(0, reasoning_block)

        return blocks
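The excerpt above is easiest to see in action by aggregating chunks the way a streaming loop would. The sketch below is illustrative only: the tool name, arguments, and ids are invented, and only the AIMessageChunk API itself comes from langchain_core. Adding chunks with + concatenates string content and merges tool_call_chunks that share an index; once the accumulated args form valid JSON, the chunk's validators can surface them as tool_calls.

# Illustrative sketch: aggregate streamed chunks with `+`.
# The tool name, arguments, and ids below are invented for the example.
from langchain_core.messages import AIMessageChunk

chunk1 = AIMessageChunk(
    content="",
    tool_call_chunks=[
        {"name": "get_weather", "args": '{"city": ', "id": "call_1", "index": 0}
    ],
)
chunk2 = AIMessageChunk(
    content="",
    tool_call_chunks=[
        {"name": None, "args": '"Paris"}', "id": None, "index": 0}
    ],
)

# `__add__` merges the chunks: tool_call_chunks sharing index 0 have their
# partial `args` strings concatenated into '{"city": "Paris"}'.
merged = chunk1 + chunk2
print(merged.tool_call_chunks)

# With complete JSON args, the merged chunk can expose a parsed tool call.
print(merged.tool_calls)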
Frequently Asked Questions
What is the AIMessageChunk class?
AIMessageChunk is the message class yielded chunk by chunk when an AI (chat model) response is streamed in the langchain codebase. It is defined in libs/core/langchain_core/messages/ai.py.
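As a rough illustration of where these chunks come from: a LangChain chat model's stream() yields AIMessageChunk objects, which can be summed into a running aggregate. The sketch below assumes the langchain-openai integration and an example model name; substitute any installed provider.

from langchain_core.messages import AIMessageChunk
from langchain_openai import ChatOpenAI  # example integration; any chat model works

model = ChatOpenAI(model="gpt-4o-mini")  # example model name

aggregate = None
for chunk in model.stream("Write one sentence about streaming."):
    # Each yielded chunk is an AIMessageChunk; `+` keeps a running aggregate.
    assert isinstance(chunk, AIMessageChunk)
    aggregate = chunk if aggregate is None else aggregate + chunk

print(aggregate.content)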
Where is AIMessageChunk defined?
AIMessageChunk is defined in libs/core/langchain_core/messages/ai.py at line 413.
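For orientation, that file path corresponds to the langchain_core.messages.ai module; the class is also re-exported from the langchain_core.messages package, which is the import path most code uses. A minimal sketch of the two equivalent imports:

from langchain_core.messages import AIMessageChunk      # common re-exported path
from langchain_core.messages.ai import AIMessageChunk   # module where it is defined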
What does AIMessageChunk extend?
AIMessageChunk extends AIMessage and BaseMessageChunk.
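Because of this dual inheritance, a chunk passes isinstance checks for both parents, which is how downstream code can treat streamed chunks as AI messages. A minimal check, using only the classes named in this document:

from langchain_core.messages import AIMessage, AIMessageChunk, BaseMessageChunk

chunk = AIMessageChunk(content="partial text")
assert isinstance(chunk, AIMessage)
assert isinstance(chunk, BaseMessageChunk)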