
ModelWithSyncStream Class — langchain Architecture

Architecture documentation for the ModelWithSyncStream class in test_base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  04857344_da22_c820_6d23_36f6a454fafb["ModelWithSyncStream"]
  d009a608_c505_bd50_7200_0de8a69ba4b7["BaseChatModel"]
  04857344_da22_c820_6d23_36f6a454fafb -->|extends| d009a608_c505_bd50_7200_0de8a69ba4b7
  8830054d_ac1e_daa9_c6c5_ff55b10d0bf3["test_base.py"]
  04857344_da22_c820_6d23_36f6a454fafb -->|defined in| 8830054d_ac1e_daa9_c6c5_ff55b10d0bf3
  766185bd_7cee_246b_f58e_8c791ffe0981["_generate()"]
  04857344_da22_c820_6d23_36f6a454fafb -->|method| 766185bd_7cee_246b_f58e_8c791ffe0981
  6737dac9_9b34_de03_c0ec_767f3937fe4f["_stream()"]
  04857344_da22_c820_6d23_36f6a454fafb -->|method| 6737dac9_9b34_de03_c0ec_767f3937fe4f
  f437a49b_8f7e_0784_9a1a_2b7239178b1c["_llm_type()"]
  04857344_da22_c820_6d23_36f6a454fafb -->|method| f437a49b_8f7e_0784_9a1a_2b7239178b1c


Source Code

libs/core/tests/unit_tests/language_models/chat_models/test_base.py lines 220–247

    class ModelWithSyncStream(BaseChatModel):
        def _generate(
            self,
            messages: list[BaseMessage],
            stop: list[str] | None = None,
            run_manager: CallbackManagerForLLMRun | None = None,
            **kwargs: Any,
        ) -> ChatResult:
            """Top Level call."""
            raise NotImplementedError

        @override
        def _stream(
            self,
            messages: list[BaseMessage],
            stop: list[str] | None = None,
            run_manager: CallbackManagerForLLMRun | None = None,
            **kwargs: Any,
        ) -> Iterator[ChatGenerationChunk]:
            """Stream the output of the model."""
            yield ChatGenerationChunk(message=AIMessageChunk(content="a"))
            yield ChatGenerationChunk(
                message=AIMessageChunk(content="b", chunk_position="last")
            )

        @property
        def _llm_type(self) -> str:
            return "fake-chat-model"

Extends

BaseChatModel

Frequently Asked Questions

What is the ModelWithSyncStream class?
ModelWithSyncStream is a test helper chat model in the langchain codebase. It subclasses BaseChatModel, implements only the synchronous _stream() method (yielding two chunks, "a" and "b", with the latter marked as the last chunk), and reports its _llm_type as "fake-chat-model". It is defined in libs/core/tests/unit_tests/language_models/chat_models/test_base.py.
Where is ModelWithSyncStream defined?
ModelWithSyncStream is defined in libs/core/tests/unit_tests/language_models/chat_models/test_base.py at line 220.
What does ModelWithSyncStream extend?
ModelWithSyncStream extends BaseChatModel.
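
Because ModelWithSyncStream overrides only the synchronous _stream(), it is well suited to exercising BaseChatModel's async fallback: in langchain-core, the default _astream() implementation runs the synchronous _stream() in an executor, so astream() still works. A minimal sketch of that behaviour, assuming the class definition above (the helper function is hypothetical):

    import asyncio


    async def consume_astream() -> list[str]:
        model = ModelWithSyncStream()
        # astream() falls back to the sync _stream(), run in an executor,
        # so the same two chunks are produced asynchronously.
        return [chunk.content async for chunk in model.astream("hello")]


    assert asyncio.run(consume_astream()) == ["a", "b"]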
