streaming_stdout.py — langchain Source File
Architecture documentation for streaming_stdout.py, a Python file in the langchain codebase. 7 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36["streaming_stdout.py"]
  d76a28c2_c3ab_00a8_5208_77807a49449d["sys"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> d76a28c2_c3ab_00a8_5208_77807a49449d
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  91721f45_4909_e489_8c1f_084f8bd87145["typing_extensions"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> 91721f45_4909_e489_8c1f_084f8bd87145
  7e64d143_ea36_1c73_4897_1d0ae1757b5b["langchain_core.callbacks.base"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> 7e64d143_ea36_1c73_4897_1d0ae1757b5b
  80d582c5_7cc3_ac96_2742_3dbe1cbd4e2b["langchain_core.agents"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> 80d582c5_7cc3_ac96_2742_3dbe1cbd4e2b
  d758344f_537f_649e_f467_b9d7442e86df["langchain_core.messages"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> d758344f_537f_649e_f467_b9d7442e86df
  ac2a9b92_4484_491e_1b48_ec85e71e1d58["langchain_core.outputs"]
  a52ec775_6c97_50be_3dd0_8e0eae9f9e36 --> ac2a9b92_4484_491e_1b48_ec85e71e1d58
  style a52ec775_6c97_50be_3dd0_8e0eae9f9e36 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
"""Callback Handler streams to stdout on new llm token."""
from __future__ import annotations
import sys
from typing import TYPE_CHECKING, Any
from typing_extensions import override
from langchain_core.callbacks.base import BaseCallbackHandler
if TYPE_CHECKING:
from langchain_core.agents import AgentAction, AgentFinish
from langchain_core.messages import BaseMessage
from langchain_core.outputs import LLMResult
class StreamingStdOutCallbackHandler(BaseCallbackHandler):
"""Callback handler for streaming.
!!! warning "Only works with LLMs that support streaming."
"""
def on_llm_start(
self, serialized: dict[str, Any], prompts: list[str], **kwargs: Any
) -> None:
"""Run when LLM starts running.
Args:
serialized: The serialized LLM.
prompts: The prompts to run.
**kwargs: Additional keyword arguments.
"""
def on_chat_model_start(
self,
serialized: dict[str, Any],
messages: list[list[BaseMessage]],
**kwargs: Any,
) -> None:
"""Run when LLM starts running.
Args:
serialized: The serialized LLM.
messages: The messages to run.
**kwargs: Additional keyword arguments.
"""
@override
def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
"""Run on new LLM token. Only available when streaming is enabled.
Args:
token: The new token.
**kwargs: Additional keyword arguments.
"""
sys.stdout.write(token)
sys.stdout.flush()
def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
// ... (93 more lines)
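The core behavior is simple: each `on_llm_new_token` callback writes the token to stdout and flushes immediately, so output appears as it is generated rather than after the full response completes. A minimal sketch of that behavior is below; the small class re-implements only the `on_llm_new_token` hook from the source above so the example runs without langchain installed (an assumption for illustration — in real use you would import `StreamingStdOutCallbackHandler` from `langchain_core.callbacks` and pass it in an LLM's `callbacks` list).

```python
import io
import sys


class StreamingStdOutCallbackHandler:
    """Sketch of the handler's token hook, mirroring the source above."""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Write and flush each token immediately so output streams
        # instead of buffering until a newline or process exit.
        sys.stdout.write(token)
        sys.stdout.flush()


handler = StreamingStdOutCallbackHandler()

# Capture stdout so we can show exactly what the handler emits
# when an LLM delivers its response token by token.
buffer = io.StringIO()
old_stdout, sys.stdout = sys.stdout, buffer
try:
    for token in ["Hello", ",", " world", "!"]:
        handler.on_llm_new_token(token)
finally:
    sys.stdout = old_stdout

print(repr(buffer.getvalue()))  # 'Hello, world!'
```

The explicit `flush()` after every token is the design point: stdout is typically line-buffered (or block-buffered when piped), so without it the streamed tokens would sit in the buffer and arrive in bursts, defeating the purpose of streaming.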
Dependencies
- langchain_core.agents
- langchain_core.callbacks.base
- langchain_core.messages
- langchain_core.outputs
- sys
- typing
- typing_extensions
Frequently Asked Questions
What does streaming_stdout.py do?
streaming_stdout.py is a source file in the langchain codebase, written in Python. It defines StreamingStdOutCallbackHandler, a callback handler that writes each new LLM token to stdout as it is generated. It belongs to the CoreAbstractions domain, Serialization subdomain.
What functions are defined in streaming_stdout.py?
streaming_stdout.py defines no module-level functions. It defines one class, StreamingStdOutCallbackHandler, whose methods include on_llm_start, on_chat_model_start, on_llm_new_token, and on_llm_end.
What does streaming_stdout.py depend on?
streaming_stdout.py imports 7 module(s): langchain_core.agents, langchain_core.callbacks.base, langchain_core.messages, langchain_core.outputs, sys, typing, typing_extensions.
Where is streaming_stdout.py in the architecture?
streaming_stdout.py is located at libs/core/langchain_core/callbacks/streaming_stdout.py (domain: CoreAbstractions, subdomain: Serialization, directory: libs/core/langchain_core/callbacks).