_stream() — langchain Function Reference

Architecture documentation for the _stream() method of OllamaLLM in libs/partners/ollama/langchain_ollama/llms.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  stream_fn["_stream()"]
  OllamaLLM["OllamaLLM"]
  stream_fn -->|defined in| OllamaLLM
  create_generate_stream_fn["_create_generate_stream()"]
  stream_fn -->|calls| create_generate_stream_fn
  style stream_fn fill:#6366f1,stroke:#818cf8,color:#fff
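
Read as code, the two edges say: _stream() is a method defined on OllamaLLM, and its loop pulls raw streamed responses from _create_generate_stream(). The skeleton below is an illustrative stand-in, not the real class, and keeps only that relationship:

    from typing import Any, Iterator

    class OllamaLLMSkeleton:  # illustrative stand-in for OllamaLLM, not the real class
        def _create_generate_stream(
            self, prompt: str, stop: list[str] | None = None, **kwargs: Any
        ) -> Iterator[dict[str, Any]]:
            # In the real class this presumably yields raw response dicts from the
            # Ollama generate endpoint; here it is deliberately left empty.
            yield from ()

        def _stream(self, prompt: str, **kwargs: Any) -> Iterator[dict[str, Any]]:
            # The "calls" edge: _stream() consumes _create_generate_stream().
            for stream_resp in self._create_generate_stream(prompt, **kwargs):
                yield stream_resp  # the real method wraps this in a GenerationChunk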

Source Code

libs/partners/ollama/langchain_ollama/llms.py lines 489–518

    def _stream(
        self,
        prompt: str,
        stop: list[str] | None = None,
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> Iterator[GenerationChunk]:
        reasoning = kwargs.get("reasoning", self.reasoning)
        for stream_resp in self._create_generate_stream(prompt, stop, **kwargs):
            if not isinstance(stream_resp, str):
                additional_kwargs = {}
                if reasoning and (thinking_content := stream_resp.get("thinking")):
                    additional_kwargs["reasoning_content"] = thinking_content

                chunk = GenerationChunk(
                    text=(stream_resp.get("response", "")),
                    generation_info={
                        "finish_reason": self.stop,
                        **additional_kwargs,
                        **(
                            dict(stream_resp) if stream_resp.get("done") is True else {}
                        ),
                    },
                )
                if run_manager:
                    run_manager.on_llm_new_token(
                        chunk.text,
                        verbose=self.verbose,
                    )
                yield chunk
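
In normal use _stream() is not called directly: it backs the public streaming interface that OllamaLLM inherits from langchain's base LLM class. A minimal usage sketch, assuming langchain-ollama is installed, a local Ollama server is running, and the example model name below has been pulled:

    from langchain_ollama import OllamaLLM

    # Example model name; any model available to the local Ollama server works.
    llm = OllamaLLM(model="llama3.1")

    # The public stream() drives _stream() internally and yields the text of
    # each GenerationChunk as it arrives.
    for token in llm.stream("Explain streaming in one sentence."):
        print(token, end="", flush=True)
    print()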

Frequently Asked Questions

What does _stream() do?
_stream() is the private streaming method of OllamaLLM, defined in libs/partners/ollama/langchain_ollama/llms.py. It iterates over the raw responses yielded by _create_generate_stream(), wraps each non-string response in a GenerationChunk (attaching reasoning content and, on the final response, the full response metadata to generation_info), reports each new token to the run manager, and yields the chunks.
Where is _stream() defined?
_stream() is defined in libs/partners/ollama/langchain_ollama/llms.py at line 489.
What does _stream() call?
_stream() calls one function: _create_generate_stream(), which produces the stream of raw responses that _stream() converts into GenerationChunk objects.
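
To make that relationship concrete, the sketch below hand-builds one final ("done") response shaped like those _create_generate_stream() yields and applies the same mapping _stream() performs. The payload values are invented for illustration; only the keys (response, thinking, done) come from the quoted source.

    from langchain_core.outputs import GenerationChunk

    # Invented sample payload; only the keys mirror what the source reads.
    stream_resp = {"response": "Hello!", "thinking": "greet the user", "done": True}

    additional_kwargs = {}
    if thinking_content := stream_resp.get("thinking"):
        additional_kwargs["reasoning_content"] = thinking_content

    chunk = GenerationChunk(
        text=stream_resp.get("response", ""),
        generation_info={**additional_kwargs, **dict(stream_resp)},
    )
    print(chunk.text)
    print(chunk.generation_info)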
