LLMManagerMixin Class — langchain Architecture
Architecture documentation for the LLMManagerMixin class in base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    ec453097_9dd7_3edc_a70f_a6b65b8f3e63["LLMManagerMixin"]
    aa78d849_32e0_cbe3_8323_1a62fafa0824["base.py"]
    ec453097_9dd7_3edc_a70f_a6b65b8f3e63 -->|defined in| aa78d849_32e0_cbe3_8323_1a62fafa0824
    29642893_8161_6aea_c2a0_1e0489c459e9["on_llm_new_token()"]
    ec453097_9dd7_3edc_a70f_a6b65b8f3e63 -->|method| 29642893_8161_6aea_c2a0_1e0489c459e9
    e995d4d2_4cb7_64ab_42a5_c0f44dbdb5c9["on_llm_end()"]
    ec453097_9dd7_3edc_a70f_a6b65b8f3e63 -->|method| e995d4d2_4cb7_64ab_42a5_c0f44dbdb5c9
    f38ee5d0_fc91_356d_6126_8f66a582b7ee["on_llm_error()"]
    ec453097_9dd7_3edc_a70f_a6b65b8f3e63 -->|method| f38ee5d0_fc91_356d_6126_8f66a582b7ee
Source Code
libs/core/langchain_core/callbacks/base.py lines 61–125
class LLMManagerMixin:
    """Mixin for LLM callbacks."""

    def on_llm_new_token(
        self,
        token: str,
        *,
        chunk: GenerationChunk | ChatGenerationChunk | None = None,
        run_id: UUID,
        parent_run_id: UUID | None = None,
        tags: list[str] | None = None,
        **kwargs: Any,
    ) -> Any:
        """Run on new output token.

        Only available when streaming is enabled.

        For both chat models and non-chat models (legacy text completion LLMs).

        Args:
            token: The new token.
            chunk: The new generated chunk, containing content and other information.
            run_id: The ID of the current run.
            parent_run_id: The ID of the parent run.
            tags: The tags.
            **kwargs: Additional keyword arguments.
        """

    def on_llm_end(
        self,
        response: LLMResult,
        *,
        run_id: UUID,
        parent_run_id: UUID | None = None,
        tags: list[str] | None = None,
        **kwargs: Any,
    ) -> Any:
        """Run when LLM ends running.

        Args:
            response: The response which was generated.
            run_id: The ID of the current run.
            parent_run_id: The ID of the parent run.
            tags: The tags.
            **kwargs: Additional keyword arguments.
        """

    def on_llm_error(
        self,
        error: BaseException,
        *,
        run_id: UUID,
        parent_run_id: UUID | None = None,
        tags: list[str] | None = None,
        **kwargs: Any,
    ) -> Any:
        """Run when LLM errors.

        Args:
            error: The error that occurred.
            run_id: The ID of the current run.
            parent_run_id: The ID of the parent run.
            tags: The tags.
            **kwargs: Additional keyword arguments.
        """
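A handler that implements these hooks can be sketched without importing langchain. The `TokenCollector` class and the driving loop below are illustrative, not part of the library: in real code you would subclass `langchain_core.callbacks.base.BaseCallbackHandler` (which inherits these hooks from `LLMManagerMixin`) and register the handler with the model so the framework invokes the hooks for you.

```python
from typing import Any
from uuid import UUID, uuid4


class TokenCollector:
    """Illustrative handler implementing the LLMManagerMixin hook signatures.

    Accumulates streamed tokens and records any error the run raises.
    """

    def __init__(self) -> None:
        self.tokens: list[str] = []
        self.errors: list[BaseException] = []
        self.final_text: str | None = None

    def on_llm_new_token(self, token: str, *, run_id: UUID, **kwargs: Any) -> None:
        # Fired once per streamed token (streaming only).
        self.tokens.append(token)

    def on_llm_end(self, response: Any, *, run_id: UUID, **kwargs: Any) -> None:
        # Fired once when the run completes successfully.
        self.final_text = "".join(self.tokens)

    def on_llm_error(self, error: BaseException, *, run_id: UUID, **kwargs: Any) -> None:
        # Fired instead of on_llm_end when the run fails.
        self.errors.append(error)


# Simulate how a callback manager would drive the hooks during one streamed run.
handler = TokenCollector()
run_id = uuid4()
for tok in ["Hello", ", ", "world"]:
    handler.on_llm_new_token(tok, run_id=run_id)
handler.on_llm_end(response=None, run_id=run_id)
```

Note that `run_id` ties all hook invocations for a single run together, which is how a handler can track several concurrent runs with one instance.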
Frequently Asked Questions
What is the LLMManagerMixin class?
LLMManagerMixin is a mixin class in the langchain codebase that defines the callback hooks fired during an LLM run: on_llm_new_token, on_llm_end, and on_llm_error. It is defined in libs/core/langchain_core/callbacks/base.py.
Where is LLMManagerMixin defined?
LLMManagerMixin is defined in libs/core/langchain_core/callbacks/base.py at line 61.