
on_chat_model_start() — langchain Function Reference

Architecture documentation for the on_chat_model_start() method of StreamingStdOutCallbackHandler, defined in streaming_stdout.py in the langchain codebase.

Entity Profile

Dependency Diagram

    graph TD
      96d82090_b515_98e4_ce10_04ed51dd0d08["on_chat_model_start()"]
      6b65bf57_0fa9_b411_5886_294d6dbe5842["StreamingStdOutCallbackHandler"]
      96d82090_b515_98e4_ce10_04ed51dd0d08 -->|defined in| 6b65bf57_0fa9_b411_5886_294d6dbe5842
      style 96d82090_b515_98e4_ce10_04ed51dd0d08 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/core/langchain_core/callbacks/streaming_stdout.py lines 35–47

    def on_chat_model_start(
        self,
        serialized: dict[str, Any],
        messages: list[list[BaseMessage]],
        **kwargs: Any,
    ) -> None:
        """Run when LLM starts running.

        Args:
            serialized: The serialized LLM.
            messages: The messages to run.
            **kwargs: Additional keyword arguments.
        """
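As the source above shows, StreamingStdOutCallbackHandler's on_chat_model_start() body contains only a docstring, so the hook does nothing by default; subclasses can override it to react when a chat model starts. The sketch below illustrates the hook's shape without importing langchain: CountingHandler is a hypothetical handler, and the BaseMessage stand-in is an assumption standing in for langchain_core.messages.BaseMessage.

```python
from typing import Any


class BaseMessage:
    """Stand-in for langchain_core.messages.BaseMessage (assumption: the
    real class carries message content; only the shape matters here)."""

    def __init__(self, content: str) -> None:
        self.content = content


class CountingHandler:
    """Hypothetical handler that overrides the hook to count invocations."""

    def __init__(self) -> None:
        self.batches_seen = 0

    def on_chat_model_start(
        self,
        serialized: dict[str, Any],
        messages: list[list[BaseMessage]],
        **kwargs: Any,
    ) -> None:
        # Called once per chat-model invocation; `messages` is a batch
        # of message lists, one list per prompt in the batch.
        self.batches_seen += len(messages)


handler = CountingHandler()
handler.on_chat_model_start({"name": "chat-model"}, [[BaseMessage("hi")]])
print(handler.batches_seen)  # → 1
```

In real langchain code the equivalent would subclass StreamingStdOutCallbackHandler and be passed to a model via its callbacks list, at which point the framework invokes on_chat_model_start() with the serialized model and the message batch.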

Frequently Asked Questions

What does on_chat_model_start() do?
on_chat_model_start() is a callback hook on StreamingStdOutCallbackHandler that fires when a chat model starts running. In this handler the method body contains only a docstring, so it is a no-op: chat-model starts produce no stdout output.
Where is on_chat_model_start() defined?
on_chat_model_start() is defined in libs/core/langchain_core/callbacks/streaming_stdout.py at line 35.
