_stream() — langchain Function Reference
Architecture documentation for the _stream() function in llms.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    stream_fn["_stream()"]
    base_llm["BaaseLLM" ]
    stream_fn -->|defined in| base_llm
    stream_pub["stream()"]
    stream_pub -->|calls| stream_fn
    style stream_fn fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/language_models/llms.py lines 702–731
def _stream(
    self,
    prompt: str,
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    **kwargs: Any,
) -> Iterator[GenerationChunk]:
    """Stream the LLM on the given prompt.

    This method should be overridden by subclasses that support streaming.

    If not implemented, the default behavior of calls to stream will be to
    fallback to the non-streaming version of the model and return
    the output as a single chunk.

    Args:
        prompt: The prompt to generate from.
        stop: Stop words to use when generating.
            Model output is cut off at the first occurrence of any of these
            substrings.
        run_manager: Callback manager for the run.
        **kwargs: Arbitrary additional keyword arguments.
            These are usually passed to the model provider API call.

    Yields:
        Generation chunks.
    """
    raise NotImplementedError
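The fallback behavior described in the docstring can be sketched in isolation. The following is a simplified, self-contained illustration of the pattern (it does not use langchain itself; the class and method names other than _stream are invented for the sketch): the public stream() method checks whether a subclass has overridden _stream(), and if not, falls back to the blocking call and yields the whole output as a single chunk.

```python
from collections.abc import Iterator


class MiniLLM:
    """Simplified stand-in for BaseLLM, illustrating the streaming fallback."""

    def _call(self, prompt: str) -> str:
        # Non-streaming code path; subclasses must implement this.
        raise NotImplementedError

    def _stream(self, prompt: str) -> Iterator[str]:
        # Streaming hook; subclasses that support streaming override this.
        raise NotImplementedError

    def stream(self, prompt: str) -> Iterator[str]:
        # If the subclass did not override _stream, fall back to the
        # non-streaming call and yield the output as one chunk.
        if type(self)._stream is MiniLLM._stream:
            yield self._call(prompt)
        else:
            yield from self._stream(prompt)


class BlockingEcho(MiniLLM):
    """No _stream override: stream() yields a single chunk."""

    def _call(self, prompt: str) -> str:
        return prompt.upper()


class StreamingEcho(MiniLLM):
    """Overrides _stream: stream() yields one chunk per word."""

    def _stream(self, prompt: str) -> Iterator[str]:
        for word in prompt.split():
            yield word.upper()


print(list(BlockingEcho().stream("hello world")))   # ['HELLO WORLD']
print(list(StreamingEcho().stream("hello world")))  # ['HELLO', 'WORLD']
```

The real BaseLLM uses the same override check (comparing `type(self)._stream` against `BaseLLM._stream`) to decide between the streaming and fallback paths.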
Frequently Asked Questions
What does _stream() do?
_stream() is the private streaming hook on BaseLLM, defined in libs/core/langchain_core/language_models/llms.py. Subclasses that support streaming override it to yield GenerationChunk objects one at a time; the base implementation raises NotImplementedError, so stream() falls back to the non-streaming code path and returns the output as a single chunk.
Where is _stream() defined?
_stream() is defined in libs/core/langchain_core/language_models/llms.py at line 702.
What calls _stream()?
_stream() is called by one function: stream(), which invokes it when a subclass provides a streaming implementation and otherwise falls back to the non-streaming path.