_on_llm_end() — langchain Function Reference
Architecture documentation for the _on_llm_end() function in langchain.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
  0a3df31a_88bd_f701_6d60_8d6d287acc08["_on_llm_end()"]
  178590bb_85ff_b79e_979a_46e5c3c5389f["LangChainTracer"]
  0a3df31a_88bd_f701_6d60_8d6d287acc08 -->|defined in| 178590bb_85ff_b79e_979a_46e5c3c5389f
  bce3427a_d6ea_f317_a362_d28206fa1696["_update_run_single()"]
  0a3df31a_88bd_f701_6d60_8d6d287acc08 -->|calls| bce3427a_d6ea_f317_a362_d28206fa1696
  706241bd_703f_99a6_9c98_870ee029e748["_get_usage_metadata_from_generations()"]
  0a3df31a_88bd_f701_6d60_8d6d287acc08 -->|calls| 706241bd_703f_99a6_9c98_870ee029e748
  style 0a3df31a_88bd_f701_6d60_8d6d287acc08 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/tracers/langchain.py lines 302–313
def _on_llm_end(self, run: Run) -> None:
    """Process the LLM Run."""
    # Extract usage_metadata from outputs and store in extra.metadata
    if run.outputs and "generations" in run.outputs:
        usage_metadata = _get_usage_metadata_from_generations(
            run.outputs["generations"]
        )
        if usage_metadata is not None:
            if "metadata" not in run.extra:
                run.extra["metadata"] = {}
            run.extra["metadata"]["usage_metadata"] = usage_metadata
    self._update_run_single(run)
Frequently Asked Questions
What does _on_llm_end() do?
_on_llm_end() is a method of LangChainTracer, defined in libs/core/langchain_core/tracers/langchain.py. It processes an LLM Run when the model finishes: if the run's outputs contain generations, it extracts usage_metadata via _get_usage_metadata_from_generations(), stores the result under run.extra["metadata"]["usage_metadata"], and then persists the run via _update_run_single().
Where is _on_llm_end() defined?
_on_llm_end() is defined in libs/core/langchain_core/tracers/langchain.py at line 302.
What does _on_llm_end() call?
_on_llm_end() calls 2 function(s): _get_usage_metadata_from_generations, _update_run_single.