
_astream() — langchain Function Reference

Architecture documentation for the _astream() function in llms.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  astream["_astream()"]
  anthropic_llm["AnthropicLLM"]
  astream -->|defined in| anthropic_llm
  acall["_acall()"]
  acall -->|calls| astream
  get_anthropic_stop["_get_anthropic_stop()"]
  astream -->|calls| get_anthropic_stop
  format_messages["_format_messages()"]
  astream -->|calls| format_messages
  style astream fill:#6366f1,stroke:#818cf8,color:#fff


Source Code

libs/partners/anthropic/langchain_anthropic/llms.py lines 379–422

    async def _astream(
        self,
        prompt: str,
        stop: list[str] | None = None,
        run_manager: AsyncCallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> AsyncIterator[GenerationChunk]:
        r"""Call Anthropic completion_stream and return the resulting generator.

        Args:
            prompt: The prompt to pass into the model.
            stop: Optional list of stop words to use when generating.
            run_manager: Optional callback manager for LLM run.
            kwargs: Additional keyword arguments to pass to the model.

        Returns:
            A generator representing the stream of tokens from Anthropic.

        Example:
            ```python
            prompt = "Write a poem about a stream."
            async for chunk in llm._astream(prompt):
                print(chunk.text, end="")
            ```
        """
        stop = self._get_anthropic_stop(stop)
        params = {**self._default_params, **kwargs}

        # Remove parameters not supported by Messages API
        params = {k: v for k, v in params.items() if k != "max_tokens_to_sample"}

        async with self.async_client.messages.stream(
            messages=self._format_messages(prompt),
            stop_sequences=stop if stop else None,
            **params,
        ) as stream:
            async for event in stream:
                if event.type == "content_block_delta" and hasattr(event.delta, "text"):
                    chunk = GenerationChunk(text=event.delta.text)
                    if run_manager:
                        await run_manager.on_llm_new_token(chunk.text, chunk=chunk)
                    yield chunk


Frequently Asked Questions

What does _astream() do?
_astream() asynchronously streams a completion from the Anthropic Messages API, yielding a GenerationChunk for each text delta event. It is defined in libs/partners/anthropic/langchain_anthropic/llms.py.
Where is _astream() defined?
_astream() is defined in libs/partners/anthropic/langchain_anthropic/llms.py at line 379.
What does _astream() call?
_astream() calls two functions: _format_messages() and _get_anthropic_stop().
What calls _astream()?
_astream() is called by one function: _acall().
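Before opening the stream, _astream() merges per-call kwargs over the instance's default parameters and then drops `max_tokens_to_sample`, a legacy Completions-API parameter that the Messages API does not accept. A small sketch of that merge-and-filter step (the default values below are illustrative assumptions):

```python
def build_params(default_params: dict, **kwargs) -> dict:
    # Per-call kwargs take precedence over instance defaults.
    params = {**default_params, **kwargs}
    # Drop the legacy Completions-API key, as _astream() does.
    return {k: v for k, v in params.items() if k != "max_tokens_to_sample"}


defaults = {"model": "claude-2", "max_tokens_to_sample": 256, "temperature": 1.0}
print(build_params(defaults, temperature=0.2))
```

Because the merge is a plain dict union, a caller-supplied `temperature` overrides the default, while the unsupported key is stripped regardless of where it came from.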
