
_stream() — langchain Function Reference

Architecture documentation for the _stream() method of ChatFireworks, defined in chat_models.py in the langchain codebase.


Dependency Diagram

graph TD
  2c99d0d8_7ba8_b5ed_0b85_d9060b83b85d["_stream()"]
  1a5cd25a_9420_c6b2_ec8d_2b53c6427514["ChatFireworks"]
  2c99d0d8_7ba8_b5ed_0b85_d9060b83b85d -->|defined in| 1a5cd25a_9420_c6b2_ec8d_2b53c6427514
  12942fee_5eb3_0410_27b1_0d94ecd994a1["_generate()"]
  12942fee_5eb3_0410_27b1_0d94ecd994a1 -->|calls| 2c99d0d8_7ba8_b5ed_0b85_d9060b83b85d
  db2ab649_5bf1_261f_34a0_aed78a625fdf["_create_message_dicts()"]
  2c99d0d8_7ba8_b5ed_0b85_d9060b83b85d -->|calls| db2ab649_5bf1_261f_34a0_aed78a625fdf
  def2bd88_9f61_77d1_2fcb_e9dad1281a22["_convert_chunk_to_message_chunk()"]
  2c99d0d8_7ba8_b5ed_0b85_d9060b83b85d -->|calls| def2bd88_9f61_77d1_2fcb_e9dad1281a22
  style 2c99d0d8_7ba8_b5ed_0b85_d9060b83b85d fill:#6366f1,stroke:#818cf8,color:#fff


Source Code

libs/partners/fireworks/langchain_fireworks/chat_models.py lines 483–516

    def _stream(
        self,
        messages: list[BaseMessage],
        stop: list[str] | None = None,
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> Iterator[ChatGenerationChunk]:
        message_dicts, params = self._create_message_dicts(messages, stop)
        params = {**params, **kwargs, "stream": True}

        default_chunk_class: type[BaseMessageChunk] = AIMessageChunk
        for chunk in self.client.create(messages=message_dicts, **params):
            if not isinstance(chunk, dict):
                chunk = chunk.model_dump()
            if len(chunk["choices"]) == 0:
                continue
            choice = chunk["choices"][0]
            message_chunk = _convert_chunk_to_message_chunk(chunk, default_chunk_class)
            generation_info = {}
            if finish_reason := choice.get("finish_reason"):
                generation_info["finish_reason"] = finish_reason
                generation_info["model_name"] = self.model_name
            logprobs = choice.get("logprobs")
            if logprobs:
                generation_info["logprobs"] = logprobs
            default_chunk_class = message_chunk.__class__
            generation_chunk = ChatGenerationChunk(
                message=message_chunk, generation_info=generation_info or None
            )
            if run_manager:
                run_manager.on_llm_new_token(
                    generation_chunk.text, chunk=generation_chunk, logprobs=logprobs
                )
            yield generation_chunk
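
In practice this method is not called directly; it backs the public streaming interface of ChatFireworks. The sketch below shows one way to exercise it, assuming langchain-fireworks is installed, a Fireworks API key is available in the environment, and the model name is only illustrative:

    from langchain_fireworks import ChatFireworks

    # Model name is illustrative; substitute any Fireworks chat model identifier.
    llm = ChatFireworks(model="accounts/fireworks/models/llama-v3p1-8b-instruct")

    # .stream() is the public entry point; under the hood it drives _stream(),
    # yielding one message chunk per streamed delta from the API.
    for chunk in llm.stream("Write a haiku about streaming APIs."):
        print(chunk.content, end="", flush=True)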


Frequently Asked Questions

What does _stream() do?
_stream() streams a chat completion from the Fireworks API. It converts the input messages to request dicts via _create_message_dicts(), calls self.client.create(..., stream=True), converts each raw chunk to a message chunk with _convert_chunk_to_message_chunk(), records finish_reason, model_name, and logprobs in the generation info when present, notifies the run manager of each new token, and yields ChatGenerationChunk objects.
Where is _stream() defined?
_stream() is defined in libs/partners/fireworks/langchain_fireworks/chat_models.py at line 483.
What does _stream() call?
_stream() calls two functions: _create_message_dicts() and _convert_chunk_to_message_chunk().
What calls _stream()?
_stream() is called by one function: _generate().
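
For context on that caller relationship, the sketch below illustrates the common langchain pattern by which a _generate() implementation delegates to _stream() and folds the chunks back into a single result. It is a sketch of the pattern, not the verbatim ChatFireworks._generate() body, and the self.streaming flag is assumed for illustration:

    from langchain_core.language_models.chat_models import generate_from_stream

    def _generate(self, messages, stop=None, run_manager=None, **kwargs):
        # Hypothetical delegation path: when streaming is enabled, consume
        # _stream() and aggregate its ChatGenerationChunks into a ChatResult.
        if self.streaming:  # assumed attribute, common on langchain chat models
            stream_iter = self._stream(
                messages, stop=stop, run_manager=run_manager, **kwargs
            )
            return generate_from_stream(stream_iter)
        # Non-streaming request path omitted here.
        ...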
