on_llm_start() — langchain Function Reference
Architecture documentation for the on_llm_start() method of FinalStreamingStdOutCallbackHandler in streaming_stdout_final_only.py from the langchain codebase.
Dependency Diagram
graph TD
    6975d980_8638_2ffa_419c_fca08016e68f["on_llm_start()"]
    f95e695f_57c0_a0ed_1f28_1ea7c2b7584d["FinalStreamingStdOutCallbackHandler"]
    6975d980_8638_2ffa_419c_fca08016e68f -->|defined in| f95e695f_57c0_a0ed_1f28_1ea7c2b7584d
    style 6975d980_8638_2ffa_419c_fca08016e68f fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/langchain_classic/callbacks/streaming_stdout_final_only.py lines 69–76
def on_llm_start(
self,
serialized: dict[str, Any],
prompts: list[str],
**kwargs: Any,
) -> None:
"""Run when LLM starts running."""
self.answer_reached = False
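The reset performed by on_llm_start() is what makes the handler reusable across runs: each new LLM call starts with answer_reached set to False, so output stays suppressed until the final-answer prefix is detected again. Below is a minimal, self-contained sketch of that gating pattern (not the actual langchain implementation; the FinalOnlyHandler class, its answer_prefix parameter, and the buffering logic are illustrative assumptions):

```python
# Illustrative stand-in for the "final answer only" streaming pattern.
# NOT the real FinalStreamingStdOutCallbackHandler; a minimal sketch of
# how resetting answer_reached in on_llm_start() gates token output.

class FinalOnlyHandler:
    """Suppress tokens until an answer prefix appears in the stream."""

    def __init__(self, answer_prefix: str = "Final Answer:") -> None:
        self.answer_prefix = answer_prefix
        self.answer_reached = False
        self.seen = ""  # rolling buffer used to detect the prefix

    def on_llm_start(self, serialized: dict, prompts: list, **kwargs) -> None:
        # Same idea as the real method: reset the gate for a fresh run.
        self.answer_reached = False
        self.seen = ""

    def on_llm_new_token(self, token: str, **kwargs) -> list[str]:
        # Emit tokens only once the answer prefix has been observed.
        if self.answer_reached:
            return [token]
        self.seen += token
        if self.answer_prefix in self.seen:
            self.answer_reached = True
        return []


handler = FinalOnlyHandler()
handler.on_llm_start({}, [])
emitted: list[str] = []
for tok in ["Thought: reasoning...", " Final Answer:", " 42"]:
    emitted.extend(handler.on_llm_new_token(tok))
print("".join(emitted))  # → " 42" (only tokens after the prefix)
```

Because on_llm_start() clears both the flag and the buffer, the same handler instance can be attached to many sequential LLM calls without leaking state between them.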
Frequently Asked Questions
What does on_llm_start() do?
on_llm_start() is a callback method of FinalStreamingStdOutCallbackHandler in the langchain codebase, defined in libs/langchain/langchain_classic/callbacks/streaming_stdout_final_only.py. It runs when an LLM call starts and resets the handler's answer_reached flag to False, so streaming to stdout remains suppressed until the final-answer prefix is detected in the new run.
Where is on_llm_start() defined?
on_llm_start() is defined in libs/langchain/langchain_classic/callbacks/streaming_stdout_final_only.py at line 69.