_on_llm_start() — langchain Function Reference
Architecture documentation for the _on_llm_start() function in stdout.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    3847ea36_9090_74f7_44f0_b0626355ef3b["_on_llm_start()"]
    56d9fac7_db71_9c71_7cac_1d680d392fa0["FunctionCallbackHandler"]
    3847ea36_9090_74f7_44f0_b0626355ef3b -->|defined in| 56d9fac7_db71_9c71_7cac_1d680d392fa0
    1bda6774_d332_dec5_ed8f_8d6eef57961e["get_breadcrumbs()"]
    3847ea36_9090_74f7_44f0_b0626355ef3b -->|calls| 1bda6774_d332_dec5_ed8f_8d6eef57961e
    95875f96_0c2d_b364_e63a_651f8ed4cdcd["try_json_stringify()"]
    3847ea36_9090_74f7_44f0_b0626355ef3b -->|calls| 95875f96_0c2d_b364_e63a_651f8ed4cdcd
    style 3847ea36_9090_74f7_44f0_b0626355ef3b fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/tracers/stdout.py lines 136–147
def _on_llm_start(self, run: Run) -> None:
    crumbs = self.get_breadcrumbs(run)
    inputs = (
        {"prompts": [p.strip() for p in run.inputs["prompts"]]}
        if "prompts" in run.inputs
        else run.inputs
    )
    self.function_callback(
        f"{get_colored_text('[llm/start]', color='green')} "
        + get_bolded_text(f"[{crumbs}] Entering LLM run with input:\n")
        + f"{try_json_stringify(inputs, '[inputs]')}"
    )
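The core of the method is the inputs normalization and JSON rendering. The sketch below replicates that logic in isolation, without LangChain: `normalize_inputs` mirrors the conditional expression above, and `try_json_stringify_sketch` is an assumed stand-in for LangChain's `try_json_stringify` helper (pretty-print if serializable, otherwise return a fallback string); both names are illustrative, not the library's API.

```python
import json

def try_json_stringify_sketch(obj, fallback):
    # Stand-in for langchain's try_json_stringify (assumption): return a
    # pretty-printed JSON string, or the fallback if obj is not serializable.
    try:
        return json.dumps(obj, indent=2, ensure_ascii=False)
    except Exception:
        return fallback

def normalize_inputs(run_inputs):
    # Mirrors the branch in _on_llm_start: strip whitespace from each
    # prompt when a "prompts" key is present, otherwise pass inputs through.
    if "prompts" in run_inputs:
        return {"prompts": [p.strip() for p in run_inputs["prompts"]]}
    return run_inputs

inputs = normalize_inputs({"prompts": ["  Hello  ", "World\n"]})
print(try_json_stringify_sketch(inputs, "[inputs]"))
```

Stripping the prompts before serialization keeps leading/trailing whitespace (common in templated prompts) out of the traced log line.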
Frequently Asked Questions
What does _on_llm_start() do?
_on_llm_start() is a method of FunctionCallbackHandler, defined in libs/core/langchain_core/tracers/stdout.py. When an LLM run starts, it builds the run's breadcrumb trail, strips surrounding whitespace from any prompts in the run's inputs, and passes a colored, bolded "[llm/start]" log line with the JSON-stringified inputs to the handler's callback.
Where is _on_llm_start() defined?
_on_llm_start() is defined in libs/core/langchain_core/tracers/stdout.py at line 136.
What does _on_llm_start() call?
_on_llm_start() calls two functions: get_breadcrumbs(), which builds the breadcrumb trail identifying the run, and try_json_stringify(), which renders the inputs as JSON (falling back to the string '[inputs]' if they cannot be serialized).
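The message that _on_llm_start() ultimately emits can be sketched as plain string assembly. The function and the raw ANSI escape codes below are assumptions standing in for LangChain's get_colored_text and get_bolded_text utilities; only the "[llm/start]" tag and the "Entering LLM run with input:" text come from the source above.

```python
def format_llm_start_line(breadcrumbs, inputs_json):
    # Hypothetical recreation of the line _on_llm_start builds: a green
    # "[llm/start]" tag, a bolded breadcrumb header, then the inputs JSON.
    # The ANSI codes approximate get_colored_text / get_bolded_text.
    green, bold, reset = "\x1b[32;1m", "\x1b[1m", "\x1b[0m"
    return (
        f"{green}[llm/start]{reset} "
        + f"{bold}[{breadcrumbs}] Entering LLM run with input:\n{reset}"
        + inputs_json
    )

print(format_llm_start_line("1:llm:FakeLLM", '{\n  "prompts": []\n}'))
```

The breadcrumb string (here a made-up "1:llm:FakeLLM") is what get_breadcrumbs() supplies, so consecutive trace lines can be matched to the same run.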