_stream() — langchain Function Reference

Architecture documentation for the _stream() function in chat_models.py from the langchain codebase.

Dependency Diagram

graph TD
  835c729a_aa91_126c_1e8e_2eca41c70df1["_stream()"]
  d5ca3c3a_3c29_0cb2_a156_35c92a31f5fd["ChatGroq"]
  835c729a_aa91_126c_1e8e_2eca41c70df1 -->|defined in| d5ca3c3a_3c29_0cb2_a156_35c92a31f5fd
  af2bf0e7_66f5_c6a1_a664_d03828bcbf1b["_generate()"]
  af2bf0e7_66f5_c6a1_a664_d03828bcbf1b -->|calls| 835c729a_aa91_126c_1e8e_2eca41c70df1
  4149b891_9f6b_d1d3_0159_d2a1de034bf3["_create_message_dicts()"]
  835c729a_aa91_126c_1e8e_2eca41c70df1 -->|calls| 4149b891_9f6b_d1d3_0159_d2a1de034bf3
  943d2ee7_1f7a_632d_e35f_f168f64894fe["_convert_chunk_to_message_chunk()"]
  835c729a_aa91_126c_1e8e_2eca41c70df1 -->|calls| 943d2ee7_1f7a_632d_e35f_f168f64894fe
  style 835c729a_aa91_126c_1e8e_2eca41c70df1 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/groq/langchain_groq/chat_models.py lines 645–695

    def _stream(
        self,
        messages: list[BaseMessage],
        stop: list[str] | None = None,
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> Iterator[ChatGenerationChunk]:
        message_dicts, params = self._create_message_dicts(messages, stop)

        params = {**params, **kwargs, "stream": True}

        default_chunk_class: type[BaseMessageChunk] = AIMessageChunk
        for chunk in self.client.create(messages=message_dicts, **params):
            if not isinstance(chunk, dict):
                chunk = chunk.model_dump()  # noqa: PLW2901
            if len(chunk["choices"]) == 0:
                continue
            choice = chunk["choices"][0]
            message_chunk = _convert_chunk_to_message_chunk(chunk, default_chunk_class)
            generation_info = {}
            if finish_reason := choice.get("finish_reason"):
                generation_info["finish_reason"] = finish_reason
                generation_info["model_name"] = self.model_name
                if system_fingerprint := chunk.get("system_fingerprint"):
                    generation_info["system_fingerprint"] = system_fingerprint
                service_tier = params.get("service_tier") or self.service_tier
                generation_info["service_tier"] = service_tier
                reasoning_effort = (
                    params.get("reasoning_effort") or self.reasoning_effort
                )
                if reasoning_effort:
                    generation_info["reasoning_effort"] = reasoning_effort
            logprobs = choice.get("logprobs")
            if logprobs:
                generation_info["logprobs"] = logprobs

            if generation_info:
                message_chunk = message_chunk.model_copy(
                    update={"response_metadata": generation_info}
                )

            default_chunk_class = message_chunk.__class__
            generation_chunk = ChatGenerationChunk(
                message=message_chunk, generation_info=generation_info or None
            )

            if run_manager:
                run_manager.on_llm_new_token(
                    generation_chunk.text, chunk=generation_chunk, logprobs=logprobs
                )
            yield generation_chunk
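The per-chunk bookkeeping in the loop above can be illustrated without the Groq client. The following standalone sketch (mock chunk dicts and the `process_chunks` name are illustrative, not part of langchain) mirrors how `_stream()` skips empty choices and attaches `finish_reason` metadata only when the stream reports one, i.e. on the final content chunk:

```python
from typing import Any, Iterator

def process_chunks(
    raw_chunks: list[dict[str, Any]], model_name: str
) -> Iterator[tuple[str, dict[str, Any]]]:
    """Yield (text, generation_info) pairs, mimicking _stream()'s loop."""
    for chunk in raw_chunks:
        if len(chunk["choices"]) == 0:
            continue  # heartbeat/empty chunks are skipped, as in _stream()
        choice = chunk["choices"][0]
        text = choice.get("delta", {}).get("content") or ""
        generation_info: dict[str, Any] = {}
        # Metadata is attached only when the chunk carries a finish_reason,
        # which the API sends on the last content chunk.
        if finish_reason := choice.get("finish_reason"):
            generation_info["finish_reason"] = finish_reason
            generation_info["model_name"] = model_name
        yield text, generation_info

# Mock chunks in the shape the Groq streaming API returns after model_dump().
chunks = [
    {"choices": []},  # skipped
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}, "finish_reason": "stop"}]},
]
out = list(process_chunks(chunks, "llama-3.1-8b-instant"))
```

The real method additionally threads `system_fingerprint`, `service_tier`, `reasoning_effort`, and `logprobs` into `generation_info` and wraps each pair in a `ChatGenerationChunk`, but the control flow is the same.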

Frequently Asked Questions

What does _stream() do?
_stream() is a method of ChatGroq, defined in libs/partners/groq/langchain_groq/chat_models.py. It sends a streaming chat completion request to the Groq API (forcing "stream": True) and yields a ChatGenerationChunk for each chunk the API returns, attaching metadata such as finish_reason, model_name, and logprobs to the final chunk.
Where is _stream() defined?
_stream() is defined in libs/partners/groq/langchain_groq/chat_models.py at line 645.
What does _stream() call?
_stream() calls two functions: _create_message_dicts() to build the request payload, and _convert_chunk_to_message_chunk() to turn each raw API chunk into a message chunk.
What calls _stream()?
_stream() is called by one function: _generate().
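When streaming is enabled, _generate() consumes _stream() and folds the chunks into a single result (in langchain this aggregation is handled by a helper such as generate_from_stream). A minimal, dependency-free sketch of that fold, using plain (text, info) tuples in place of ChatGenerationChunk objects (the `aggregate` name is illustrative):

```python
from typing import Any, Iterator

def aggregate(
    stream: Iterator[tuple[str, dict[str, Any]]],
) -> tuple[str, dict[str, Any]]:
    """Concatenate streamed text and merge per-chunk metadata (last write wins)."""
    text_parts: list[str] = []
    merged_info: dict[str, Any] = {}
    for text, info in stream:
        text_parts.append(text)
        # finish_reason and friends arrive only on the final chunk,
        # so updating as we go leaves the complete metadata at the end.
        merged_info.update(info)
    return "".join(text_parts), merged_info

full_text, info = aggregate(iter([
    ("Hel", {}),
    ("lo", {"finish_reason": "stop"}),
]))
```

This is the same shape as summing ChatGenerationChunk objects with `+` in langchain: text concatenates, metadata merges.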
