stream() — langchain Function Reference
Architecture documentation for the stream() function in llms.py from the langchain codebase.
Dependency Diagram
graph TD
    c1366477_d684_c841_c3d0_845479ea9a84["stream()"]
    ce4aa464_3868_179e_5d99_df48bc307c5f["BaseLLM"]
    c1366477_d684_c841_c3d0_845479ea9a84 -->|defined in| ce4aa464_3868_179e_5d99_df48bc307c5f
    ca82d7c2_13eb_a458_db91_e70859a9fdaa["invoke()"]
    c1366477_d684_c841_c3d0_845479ea9a84 -->|calls| ca82d7c2_13eb_a458_db91_e70859a9fdaa
    38007078_b49e_18a9_9c2b_bb8361fa3fb3["_convert_input()"]
    c1366477_d684_c841_c3d0_845479ea9a84 -->|calls| 38007078_b49e_18a9_9c2b_bb8361fa3fb3
    b4a028e5_e42e_3478_739f_03ee8ab9100d["dict()"]
    c1366477_d684_c841_c3d0_845479ea9a84 -->|calls| b4a028e5_e42e_3478_739f_03ee8ab9100d
    4573eeb7_3c0a_7ca6_23d7_ed5efc46fdb1["_get_ls_params()"]
    c1366477_d684_c841_c3d0_845479ea9a84 -->|calls| 4573eeb7_3c0a_7ca6_23d7_ed5efc46fdb1
    35910542_1b5c_a95a_2115_9945398761d0["_stream()"]
    c1366477_d684_c841_c3d0_845479ea9a84 -->|calls| 35910542_1b5c_a95a_2115_9945398761d0
    style c1366477_d684_c841_c3d0_845479ea9a84 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/langchain_core/language_models/llms.py lines 508–572
def stream(
self,
input: LanguageModelInput,
config: RunnableConfig | None = None,
*,
stop: list[str] | None = None,
**kwargs: Any,
) -> Iterator[str]:
if type(self)._stream == BaseLLM._stream: # noqa: SLF001
# model doesn't implement streaming, so use default implementation
yield self.invoke(input, config=config, stop=stop, **kwargs)
else:
prompt = self._convert_input(input).to_string()
config = ensure_config(config)
params = self.dict()
params["stop"] = stop
params = {**params, **kwargs}
options = {"stop": stop}
inheritable_metadata = {
**(config.get("metadata") or {}),
**self._get_ls_params(stop=stop, **kwargs),
}
callback_manager = CallbackManager.configure(
config.get("callbacks"),
self.callbacks,
self.verbose,
config.get("tags"),
self.tags,
inheritable_metadata,
self.metadata,
)
(run_manager,) = callback_manager.on_llm_start(
self._serialized,
[prompt],
invocation_params=params,
options=options,
name=config.get("run_name"),
run_id=config.pop("run_id", None),
batch_size=1,
)
generation: GenerationChunk | None = None
try:
for chunk in self._stream(
prompt, stop=stop, run_manager=run_manager, **kwargs
):
yield chunk.text
if generation is None:
generation = chunk
else:
generation += chunk
except BaseException as e:
run_manager.on_llm_error(
e,
response=LLMResult(
generations=[[generation]] if generation else []
),
)
raise
if generation is None:
err = ValueError("No generation chunks were returned")
run_manager.on_llm_error(err, response=LLMResult(generations=[]))
raise err
run_manager.on_llm_end(LLMResult(generations=[[generation]]))
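The first branch of stream() is a class-level dispatch check: real streaming only happens when the subclass has overridden _stream(); otherwise the whole invoke() result is yielded as a single chunk. That dispatch pattern can be sketched in isolation (the Base and EchoModel classes below are hypothetical stand-ins, not langchain code):

```python
from collections.abc import Iterator


class Base:
    """Minimal stand-in for BaseLLM (illustrative only)."""

    def invoke(self, prompt: str) -> str:
        return f"full:{prompt}"

    def _stream(self, prompt: str) -> Iterator[str]:
        raise NotImplementedError

    def stream(self, prompt: str) -> Iterator[str]:
        # Same check as in stream(): if the subclass left _stream
        # untouched, fall back to one invoke() call yielded whole.
        if type(self)._stream == Base._stream:
            yield self.invoke(prompt)
        else:
            yield from self._stream(prompt)


class EchoModel(Base):
    """Overrides _stream, so stream() yields token by token."""

    def _stream(self, prompt: str) -> Iterator[str]:
        for token in prompt.split():
            yield token


print(list(Base().stream("hi there")))       # ['full:hi there']
print(list(EchoModel().stream("hi there")))  # ['hi', 'there']
```

Comparing the unbound methods on `type(self)` rather than checking an attribute flag means subclasses opt in to streaming simply by defining _stream(), with no extra registration step.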
Frequently Asked Questions
What does stream() do?
stream() yields an LLM's output as a sequence of text chunks. If the model subclass does not override _stream(), it falls back to a single invoke() call and yields the full result as one string. Otherwise it converts the input to a prompt string, configures the callback lifecycle (on_llm_start, on_llm_end, on_llm_error), yields chunk.text for each GenerationChunk produced by _stream(), and accumulates the chunks into one generation reported in the final LLMResult.
Where is stream() defined?
stream() is defined in libs/core/langchain_core/language_models/llms.py at line 508.
What does stream() call?
stream() calls five functions: _convert_input(), _get_ls_params(), _stream(), dict(), and invoke().