on_llm_end() — langchain Function Reference
Architecture documentation for the on_llm_end() function in event_stream.py from the langchain codebase.
Dependency Diagram
graph TD
    b4d93379_4f7c_1456_ee6f_f9bf387372a3["on_llm_end()"]
    33d093c4_1ed0_fc6a_17c6_762d4c5cfa04["_AstreamEventsCallbackHandler"]
    b4d93379_4f7c_1456_ee6f_f9bf387372a3 -->|defined in| 33d093c4_1ed0_fc6a_17c6_762d4c5cfa04
    87f79bee_f9c5_8262_1829_62f633c4f870["_send()"]
    b4d93379_4f7c_1456_ee6f_f9bf387372a3 -->|calls| 87f79bee_f9c5_8262_1829_62f633c4f870
    9aed8e4f_9d4c_016f_aa43_c5908015cf8d["_get_parent_ids()"]
    b4d93379_4f7c_1456_ee6f_f9bf387372a3 -->|calls| 9aed8e4f_9d4c_016f_aa43_c5908015cf8d
    style b4d93379_4f7c_1456_ee6f_f9bf387372a3 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/tracers/event_stream.py lines 489–547
async def on_llm_end(
    self, response: LLMResult, *, run_id: UUID, **kwargs: Any
) -> None:
    """End a trace for a model run.

    For both chat models and non-chat models (legacy text-completion LLMs).

    Raises:
        ValueError: If the run type is not `'llm'` or `'chat_model'`.
    """
    run_info = self.run_map.pop(run_id)
    inputs_ = run_info.get("inputs")
    generations: list[list[GenerationChunk]] | list[list[ChatGenerationChunk]]
    output: dict | BaseMessage = {}
    if run_info["run_type"] == "chat_model":
        generations = cast("list[list[ChatGenerationChunk]]", response.generations)
        for gen in generations:
            if output != {}:
                break
            for chunk in gen:
                output = chunk.message
                break
        event = "on_chat_model_end"
    elif run_info["run_type"] == "llm":
        generations = cast("list[list[GenerationChunk]]", response.generations)
        output = {
            "generations": [
                [
                    {
                        "text": chunk.text,
                        "generation_info": chunk.generation_info,
                        "type": chunk.type,
                    }
                    for chunk in gen
                ]
                for gen in generations
            ],
            "llm_output": response.llm_output,
        }
        event = "on_llm_end"
    else:
        msg = f"Unexpected run type: {run_info['run_type']}"
        raise ValueError(msg)
    self._send(
        {
            "event": event,
            "data": {"output": output, "input": inputs_},
            "run_id": str(run_id),
            "name": run_info["name"],
            "tags": run_info["tags"],
            "metadata": run_info["metadata"],
            "parent_ids": self._get_parent_ids(run_id),
        },
        run_info["run_type"],
    )
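To see the events this method emits from the consumer side, the following sketch drives a model through the public astream_events API, which is backed by this handler. This is a minimal sketch, not part of the source: it assumes langchain_core's FakeListChatModel test helper in place of a real chat model, and the prompt and response text are placeholders.

import asyncio

from langchain_core.language_models import FakeListChatModel


async def main() -> None:
    # FakeListChatModel replays canned responses, so no API key is needed.
    model = FakeListChatModel(responses=["Hello!"])
    async for event in model.astream_events("Hi there", version="v2"):
        if event["event"] == "on_chat_model_end":
            # data["output"] is the message selected by on_llm_end(), and
            # parent_ids is populated via _get_parent_ids().
            print(event["run_id"], event["parent_ids"], event["data"]["output"])


asyncio.run(main())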
Frequently Asked Questions
What does on_llm_end() do?
on_llm_end() is a callback method on _AstreamEventsCallbackHandler in libs/core/langchain_core/tracers/event_stream.py that closes out a traced model run. It pops the run's bookkeeping entry from run_map, assembles the final output (the first generated message for chat models, or a dict of generations and llm_output for legacy text-completion LLMs), and emits an on_chat_model_end or on_llm_end event via _send().
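For illustration, the dictionary it hands to _send() for a chat-model run has the shape below. The keys mirror the source above; the values are placeholders, not captured output.

from langchain_core.messages import AIMessage

# Hypothetical payload for a chat-model run; keys come from the source above.
event = {
    "event": "on_chat_model_end",
    "data": {"output": AIMessage(content="Hello!"), "input": {"messages": []}},
    "run_id": "00000000-0000-0000-0000-000000000000",  # stringified UUID
    "name": "FakeListChatModel",
    "tags": [],
    "metadata": {},
    "parent_ids": [],
}
print(event["event"], event["data"]["output"].content)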
Where is on_llm_end() defined?
on_llm_end() is defined in libs/core/langchain_core/tracers/event_stream.py at line 489.
What does on_llm_end() call?
on_llm_end() calls two functions: _send() and _get_parent_ids().