_stream() — langchain Function Reference
Architecture documentation for the _stream() function in llms.py from the langchain codebase.
Entity Profile
Dependency Diagram
```mermaid
graph TD
    cb1cae7d_4a5d_cfbf_994a_88075fe175e2["_stream()"]
    c95a497f_938f_2be9_842e_087a0766cf00["AnthropicLLM"]
    cb1cae7d_4a5d_cfbf_994a_88075fe175e2 -->|defined in| c95a497f_938f_2be9_842e_087a0766cf00
    e1e4f815_3e2a_75f8_675e_9a5e7f0f868a["_call()"]
    e1e4f815_3e2a_75f8_675e_9a5e7f0f868a -->|calls| cb1cae7d_4a5d_cfbf_994a_88075fe175e2
    5b0cfa0b_130d_960a_411e_cc97a2c28a93["_get_anthropic_stop()"]
    cb1cae7d_4a5d_cfbf_994a_88075fe175e2 -->|calls| 5b0cfa0b_130d_960a_411e_cc97a2c28a93
    04edfbcc_f913_d95a_6261_c2a81f9f570e["_format_messages()"]
    cb1cae7d_4a5d_cfbf_994a_88075fe175e2 -->|calls| 04edfbcc_f913_d95a_6261_c2a81f9f570e
    style cb1cae7d_4a5d_cfbf_994a_88075fe175e2 fill:#6366f1,stroke:#818cf8,color:#fff
```
Source Code
libs/partners/anthropic/langchain_anthropic/llms.py lines 334–377
def _stream(
    self,
    prompt: str,
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    **kwargs: Any,
) -> Iterator[GenerationChunk]:
    r"""Call Anthropic completion_stream and return the resulting generator.

    Args:
        prompt: The prompt to pass into the model.
        stop: Optional list of stop words to use when generating.
        run_manager: Optional callback manager for LLM run.
        kwargs: Additional keyword arguments to pass to the model.

    Returns:
        A generator representing the stream of tokens from Anthropic.

    Example:
        ```python
        prompt = "Write a poem about a stream."
        prompt = f"\n\nHuman: {prompt}\n\nAssistant:"
        generator = anthropic.stream(prompt)
        for token in generator:
            yield token
        ```
    """
    stop = self._get_anthropic_stop(stop)
    params = {**self._default_params, **kwargs}
    # Remove parameters not supported by Messages API
    params = {k: v for k, v in params.items() if k != "max_tokens_to_sample"}
    with self.client.messages.stream(
        messages=self._format_messages(prompt),
        stop_sequences=stop if stop else None,
        **params,
    ) as stream:
        for event in stream:
            if event.type == "content_block_delta" and hasattr(event.delta, "text"):
                chunk = GenerationChunk(text=event.delta.text)
                if run_manager:
                    run_manager.on_llm_new_token(chunk.text, chunk=chunk)
                yield chunk
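The `run_manager.on_llm_new_token` call above is how streamed chunks reach user-supplied callbacks. The sketch below is not part of the langchain source; it assumes typical langchain usage of a callback handler attached via the `config` argument of the public `.stream()` interface, and the model name is purely illustrative.

```python
from langchain_core.callbacks import BaseCallbackHandler
from langchain_anthropic import AnthropicLLM


class PrintTokenHandler(BaseCallbackHandler):
    """Receives each chunk that _stream() reports via on_llm_new_token."""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        print(token, end="", flush=True)


# Hypothetical usage: .stream() wires the handler into the run manager, so
# every content_block_delta yielded by _stream() is printed as it arrives.
llm = AnthropicLLM(model="claude-2.1")  # model name is illustrative
for _ in llm.stream(
    "Write a poem about a stream.",
    config={"callbacks": [PrintTokenHandler()]},
):
    pass
```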
Frequently Asked Questions
What does _stream() do?
_stream() streams a completion from the Anthropic API for a given prompt. It resolves stop sequences via _get_anthropic_stop(), formats the prompt into messages with _format_messages(), opens a streaming request through self.client.messages.stream(), and yields a GenerationChunk for each content_block_delta event, notifying the run's callback manager as each token arrives. It is defined in libs/partners/anthropic/langchain_anthropic/llms.py.
Where is _stream() defined?
_stream() is defined in libs/partners/anthropic/langchain_anthropic/llms.py at line 334.
What does _stream() call?
_stream() calls two functions: _format_messages() and _get_anthropic_stop().
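The two helpers are not shown on this page. The following is only a rough sketch of their likely roles, inferred from how _stream() uses them; the real implementations in llms.py may differ, and the attribute name used for model-level defaults is hypothetical.

```python
def _get_anthropic_stop(self, stop: list[str] | None = None) -> list[str]:
    # Combine caller-provided stop words with any model-level defaults
    # (the "default_stop" attribute below is a placeholder, not the real name).
    return list(stop or []) + list(getattr(self, "default_stop", []) or [])


def _format_messages(self, prompt: str) -> list[dict[str, str]]:
    # Wrap the raw prompt string as a single user turn for the Messages API.
    return [{"role": "user", "content": prompt}]
```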
What calls _stream()?
_stream() is called by one function: _call().
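A common pattern for that relationship, shown here as an illustrative sketch rather than the actual _call() body from llms.py, is for _call() to delegate to _stream() and concatenate the chunk texts into the final completion:

```python
from typing import Any

from langchain_core.callbacks import CallbackManagerForLLMRun


def _call(
    self,
    prompt: str,
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    **kwargs: Any,
) -> str:
    # Accumulate the streamed chunks into a single completion string.
    completion = ""
    for chunk in self._stream(prompt, stop=stop, run_manager=run_manager, **kwargs):
        completion += chunk.text
    return completion
```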