on_llm_end() — langchain Function Reference
Architecture documentation for the on_llm_end() function in usage.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    9fa49e0e_3ef5_6eac_9062_77f7927be642["on_llm_end()"]
    b3bf0e8c_5ef8_a03f_bf9e_d1c27e7284b2["UsageMetadataCallbackHandler"]
    9fa49e0e_3ef5_6eac_9062_77f7927be642 -->|defined in| b3bf0e8c_5ef8_a03f_bf9e_d1c27e7284b2
    style 9fa49e0e_3ef5_6eac_9062_77f7927be642 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/callbacks/usage.py lines 62–89
def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
    """Collect token usage."""
    # Check for usage_metadata (langchain-core >= 0.2.2)
    try:
        generation = response.generations[0][0]
    except IndexError:
        generation = None
    usage_metadata = None
    model_name = None
    if isinstance(generation, ChatGeneration):
        try:
            message = generation.message
            if isinstance(message, AIMessage):
                usage_metadata = message.usage_metadata
                model_name = message.response_metadata.get("model_name")
        except AttributeError:
            pass
    # update shared state behind lock
    if usage_metadata and model_name:
        with self._lock:
            if model_name not in self.usage_metadata:
                self.usage_metadata[model_name] = usage_metadata
            else:
                self.usage_metadata[model_name] = add_usage(
                    self.usage_metadata[model_name], usage_metadata
                )
Frequently Asked Questions
What does on_llm_end() do?
on_llm_end() is the callback fired when an LLM call completes. Defined on UsageMetadataCallbackHandler in libs/core/langchain_core/callbacks/usage.py, it pulls the first generation out of the LLMResult, reads usage_metadata and the model name from the AIMessage, and merges the token counts into the handler's per-model usage_metadata dictionary under a lock via add_usage().
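The extraction half of the function (guarded access to the first generation and its message) can be illustrated with stand-in types. The dataclasses below are illustrative assumptions, not the real langchain_core classes; `extract()` mirrors the function's behavior of returning nothing rather than raising when the structure is missing or not chat-shaped.

```python
from dataclasses import dataclass, field


@dataclass
class AIMessage:
    usage_metadata: dict
    response_metadata: dict = field(default_factory=dict)


@dataclass
class ChatGeneration:
    message: AIMessage


@dataclass
class LLMResult:
    generations: list


def extract(response: LLMResult):
    # Mirror the guarded access in on_llm_end(): empty generations or
    # non-chat messages yield (None, None) instead of raising.
    try:
        generation = response.generations[0][0]
    except IndexError:
        return None, None
    if not isinstance(generation, ChatGeneration):
        return None, None
    message = generation.message
    if not isinstance(message, AIMessage):
        return None, None
    return message.usage_metadata, message.response_metadata.get("model_name")


resp = LLMResult(generations=[[ChatGeneration(AIMessage(
    usage_metadata={"input_tokens": 4, "output_tokens": 2, "total_tokens": 6},
    response_metadata={"model_name": "gpt-4o"},
))]])
usage, model = extract(resp)
```

Here `extract(resp)` yields the usage dict and `"gpt-4o"`, while `extract(LLMResult(generations=[]))` yields `(None, None)`, matching the original's silent skip when no usable generation exists.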
Where is on_llm_end() defined?
on_llm_end() is defined in libs/core/langchain_core/callbacks/usage.py at line 62.