on_llm_start() — langchain Function Reference
Architecture documentation for the on_llm_start() function in streaming_stdout.py from the langchain codebase.
Dependency Diagram
graph TD
  eaa769c1_d32a_ef50_fa66_f5810a38668d["on_llm_start()"]
  6b65bf57_0fa9_b411_5886_294d6dbe5842["StreamingStdOutCallbackHandler"]
  eaa769c1_d32a_ef50_fa66_f5810a38668d -->|defined in| 6b65bf57_0fa9_b411_5886_294d6dbe5842
  style eaa769c1_d32a_ef50_fa66_f5810a38668d fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/callbacks/streaming_stdout.py lines 24–33
def on_llm_start(
    self, serialized: dict[str, Any], prompts: list[str], **kwargs: Any
) -> None:
    """Run when LLM starts running.

    Args:
        serialized: The serialized LLM.
        prompts: The prompts to run.
        **kwargs: Additional keyword arguments.
    """
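As the excerpt shows, the method body is only a docstring, so on_llm_start() is a no-op in this handler; subclasses can override it to act when generation begins. A minimal sketch of that pattern (using a self-contained stand-in for the langchain_core base class, so no langchain install is assumed; LoggingStreamingHandler is a hypothetical name):

```python
import sys
from typing import Any


class StreamingStdOutCallbackHandler:
    """Stand-in for langchain_core's handler, for illustration only."""

    def on_llm_start(
        self, serialized: dict[str, Any], prompts: list[str], **kwargs: Any
    ) -> None:
        """Run when LLM starts running (a no-op in the base handler)."""

    def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
        """Stream each new token to stdout as it arrives."""
        sys.stdout.write(token)
        sys.stdout.flush()


class LoggingStreamingHandler(StreamingStdOutCallbackHandler):
    """Hypothetical subclass that announces when generation begins."""

    def on_llm_start(
        self, serialized: dict[str, Any], prompts: list[str], **kwargs: Any
    ) -> None:
        print(f"LLM starting with {len(prompts)} prompt(s)")


handler = LoggingStreamingHandler()
handler.on_llm_start({}, ["Hello, world"])  # prints: LLM starting with 1 prompt(s)
```

In real use, such a handler would be passed to an LLM or chain via its callbacks mechanism; the framework then invokes on_llm_start with the serialized model and the prompts before generation starts.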
Frequently Asked Questions
What does on_llm_start() do?
on_llm_start() is a callback hook on StreamingStdOutCallbackHandler, defined in libs/core/langchain_core/callbacks/streaming_stdout.py. It is invoked when an LLM starts running, receiving the serialized LLM and the list of prompts. In this handler it is a no-op (the body is only a docstring), since StreamingStdOutCallbackHandler does its work in on_llm_new_token, writing each token to stdout as it arrives.
Where is on_llm_start() defined?
on_llm_start() is defined in libs/core/langchain_core/callbacks/streaming_stdout.py at line 24.