streaming_stdout_final_only.py — langchain Source File
Architecture documentation for streaming_stdout_final_only.py, a Python file in the langchain codebase. 4 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
    438e5230_3337_758a_8d8f_93968317fa8a["streaming_stdout_final_only.py"]
    02625e10_fb78_7ecd_1ee2_105ee470faf5["sys"]
    438e5230_3337_758a_8d8f_93968317fa8a --> 02625e10_fb78_7ecd_1ee2_105ee470faf5
    feec1ec4_6917_867b_d228_b134d0ff8099["typing"]
    438e5230_3337_758a_8d8f_93968317fa8a --> feec1ec4_6917_867b_d228_b134d0ff8099
    17a62cb3_fefd_6320_b757_b53bb4a1c661["langchain_core.callbacks"]
    438e5230_3337_758a_8d8f_93968317fa8a --> 17a62cb3_fefd_6320_b757_b53bb4a1c661
    f85fae70_1011_eaec_151c_4083140ae9e5["typing_extensions"]
    438e5230_3337_758a_8d8f_93968317fa8a --> f85fae70_1011_eaec_151c_4083140ae9e5
    style 438e5230_3337_758a_8d8f_93968317fa8a fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
"""Callback Handler streams to stdout on new llm token."""
import sys
from typing import Any
from langchain_core.callbacks import StreamingStdOutCallbackHandler
from typing_extensions import override
DEFAULT_ANSWER_PREFIX_TOKENS = ["Final", "Answer", ":"]
class FinalStreamingStdOutCallbackHandler(StreamingStdOutCallbackHandler):
"""Callback handler for streaming in agents.
Only works with agents using LLMs that support streaming.
Only the final output of the agent will be streamed.
"""
def append_to_last_tokens(self, token: str) -> None:
"""Append token to the last tokens."""
self.last_tokens.append(token)
self.last_tokens_stripped.append(token.strip())
if len(self.last_tokens) > len(self.answer_prefix_tokens):
self.last_tokens.pop(0)
self.last_tokens_stripped.pop(0)
def check_if_answer_reached(self) -> bool:
"""Check if the answer has been reached."""
if self.strip_tokens:
return self.last_tokens_stripped == self.answer_prefix_tokens_stripped
return self.last_tokens == self.answer_prefix_tokens
def __init__(
self,
*,
answer_prefix_tokens: list[str] | None = None,
strip_tokens: bool = True,
stream_prefix: bool = False,
) -> None:
"""Instantiate FinalStreamingStdOutCallbackHandler.
Args:
answer_prefix_tokens: Token sequence that prefixes the answer.
Default is ["Final", "Answer", ":"]
strip_tokens: Ignore white spaces and new lines when comparing
answer_prefix_tokens to last tokens? (to determine if answer has been
reached)
stream_prefix: Should answer prefix itself also be streamed?
"""
super().__init__()
if answer_prefix_tokens is None:
self.answer_prefix_tokens = DEFAULT_ANSWER_PREFIX_TOKENS
else:
self.answer_prefix_tokens = answer_prefix_tokens
if strip_tokens:
self.answer_prefix_tokens_stripped = [
token.strip() for token in self.answer_prefix_tokens
]
else:
self.answer_prefix_tokens_stripped = self.answer_prefix_tokens
self.last_tokens = [""] * len(self.answer_prefix_tokens)
self.last_tokens_stripped = [""] * len(self.answer_prefix_tokens)
self.strip_tokens = strip_tokens
self.stream_prefix = stream_prefix
self.answer_reached = False
@override
def on_llm_start(
self,
serialized: dict[str, Any],
prompts: list[str],
**kwargs: Any,
) -> None:
"""Run when LLM starts running."""
self.answer_reached = False
@override
def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
"""Run on new LLM token. Only available when streaming is enabled."""
# Remember the last n tokens, where n = len(answer_prefix_tokens)
self.append_to_last_tokens(token)
# Check if the last n tokens match the answer_prefix_tokens list ...
if self.check_if_answer_reached():
self.answer_reached = True
if self.stream_prefix:
for t in self.last_tokens:
sys.stdout.write(t)
sys.stdout.flush()
return
# ... if yes, then print tokens from now on
if self.answer_reached:
sys.stdout.write(token)
sys.stdout.flush()
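The core technique above is a rolling window over the last n tokens that is compared against the answer-prefix sequence on every new token; output is suppressed until the window matches. Below is a minimal standalone sketch of that logic, with no langchain_core base class, so it can run on its own. The names FinalOnlyStreamer and on_new_token are hypothetical, invented for this illustration; the real handler receives tokens via on_llm_new_token.

```python
import sys
from io import StringIO

DEFAULT_ANSWER_PREFIX_TOKENS = ["Final", "Answer", ":"]


class FinalOnlyStreamer:
    """Sketch: buffer the last n tokens, echo only after the prefix matches."""

    def __init__(self, answer_prefix_tokens=None, strip_tokens=True, out=sys.stdout):
        self.answer_prefix_tokens = answer_prefix_tokens or DEFAULT_ANSWER_PREFIX_TOKENS
        self.answer_prefix_tokens_stripped = [
            t.strip() for t in self.answer_prefix_tokens
        ]
        n = len(self.answer_prefix_tokens)
        # Pre-filled rolling window of the n most recent tokens.
        self.last_tokens = [""] * n
        self.last_tokens_stripped = [""] * n
        self.strip_tokens = strip_tokens
        self.answer_reached = False
        self.out = out

    def on_new_token(self, token):
        # Slide the window: append the new token, drop the oldest.
        self.last_tokens.append(token)
        self.last_tokens_stripped.append(token.strip())
        self.last_tokens.pop(0)
        self.last_tokens_stripped.pop(0)
        # Does the window now equal the answer prefix?
        if self.strip_tokens:
            matched = self.last_tokens_stripped == self.answer_prefix_tokens_stripped
        else:
            matched = self.last_tokens == self.answer_prefix_tokens
        if matched:
            self.answer_reached = True
            return  # the prefix itself is not echoed (stream_prefix=False behavior)
        if self.answer_reached:
            self.out.write(token)


buf = StringIO()
streamer = FinalOnlyStreamer(out=buf)
for tok in ["I", " should", " answer.", "Final", " Answer", ":", " 42", "!"]:
    streamer.on_new_token(tok)
print(buf.getvalue())  # only the tokens after the prefix: " 42!"
```

Note how strip_tokens makes the match robust to whitespace: " Answer" still matches the prefix entry "Answer" because both sides are compared stripped.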
Dependencies
- langchain_core.callbacks
- sys
- typing
- typing_extensions
Frequently Asked Questions
What does streaming_stdout_final_only.py do?
streaming_stdout_final_only.py is a Python source file in the langchain codebase. It defines FinalStreamingStdOutCallbackHandler, a callback handler that streams only an agent's final answer to stdout, suppressing all tokens before the answer-prefix sequence. It belongs to the LangChainCore domain, MessageInterface subdomain.
What does streaming_stdout_final_only.py depend on?
streaming_stdout_final_only.py imports 4 module(s): langchain_core.callbacks, sys, typing, typing_extensions.
Where is streaming_stdout_final_only.py in the architecture?
streaming_stdout_final_only.py is located at libs/langchain/langchain_classic/callbacks/streaming_stdout_final_only.py (domain: LangChainCore, subdomain: MessageInterface, directory: libs/langchain/langchain_classic/callbacks).