on_llm_start() — langchain Function Reference
Architecture documentation for the on_llm_start() method of AsyncFinalIteratorCallbackHandler, defined in streaming_aiter_final_only.py in the langchain codebase.
Dependency Diagram
graph TD
    1cfad502_36e6_46e8_0855_fee8edf85427["on_llm_start()"]
    8ff14092_eef3_abb2_55fb_ca2a28f79a83["AsyncFinalIteratorCallbackHandler"]
    1cfad502_36e6_46e8_0855_fee8edf85427 -->|defined in| 8ff14092_eef3_abb2_55fb_ca2a28f79a83
    style 1cfad502_36e6_46e8_0855_fee8edf85427 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/langchain_classic/callbacks/streaming_aiter_final_only.py lines 68–76
async def on_llm_start(
    self,
    serialized: dict[str, Any],
    prompts: list[str],
    **kwargs: Any,
) -> None:
    # If two calls are made in a row, this resets the state
    self.done.clear()
    self.answer_reached = False
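
The effect of this reset can be seen by driving the callback directly: a handler whose previous run has completed (done set, answer_reached True) returns to a clean state once on_llm_start() is awaited again. The following is a minimal sketch; the import path is an assumption and may differ between the langchain and langchain_classic layouts.

import asyncio

# Import path is an assumption; newer layouts may expose this module
# under langchain_classic.callbacks instead of langchain.callbacks.
from langchain.callbacks.streaming_aiter_final_only import (
    AsyncFinalIteratorCallbackHandler,
)


async def demo_reset() -> None:
    handler = AsyncFinalIteratorCallbackHandler()

    # Simulate a finished run: the final answer was streamed and the run ended.
    handler.answer_reached = True
    handler.done.set()

    # A second LLM call on the same handler begins with on_llm_start(),
    # which clears the `done` event and resets `answer_reached`.
    await handler.on_llm_start(serialized={}, prompts=["next question"])
    assert handler.answer_reached is False
    assert not handler.done.is_set()


asyncio.run(demo_reset())
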
Frequently Asked Questions
What does on_llm_start() do?
on_llm_start() is the callback invoked when an LLM call begins. In AsyncFinalIteratorCallbackHandler it resets the handler's streaming state: it clears the `done` event and sets `answer_reached` back to False, so that a second LLM call made with the same handler does not inherit state from the previous one. It is defined in libs/langchain/langchain_classic/callbacks/streaming_aiter_final_only.py; see the usage sketch below.
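
For orientation, a hedged sketch of the surrounding usage pattern: the handler is attached as a callback, the run is launched as a background task, and the caller consumes only the final-answer tokens via aiter(). The import path, the agent_executor object, and its arun() call are assumptions for illustration and may differ across langchain versions.

import asyncio

# Import path is an assumption; it may live under langchain_classic.callbacks.
from langchain.callbacks.streaming_aiter_final_only import (
    AsyncFinalIteratorCallbackHandler,
)


async def stream_final_answer(agent_executor, question: str) -> str:
    """Run a hypothetical agent executor and print only its final-answer tokens."""
    handler = AsyncFinalIteratorCallbackHandler()

    # Launch the run in the background; on_llm_start() fires at the start of
    # every LLM call inside the run and resets the handler's state.
    task = asyncio.create_task(agent_executor.arun(question, callbacks=[handler]))

    # aiter() yields tokens queued by the handler; this final-only variant
    # queues only the tokens that follow the answer prefix.
    async for token in handler.aiter():
        print(token, end="", flush=True)

    return await task
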
Where is on_llm_start() defined?
on_llm_start() is defined in libs/langchain/langchain_classic/callbacks/streaming_aiter_final_only.py at line 68.