test_stream() — langchain Function Reference

Architecture documentation for the test_stream() function in chat_models.py from the langchain codebase.

Entity Profile

test_stream() is a test method on the ChatModelIntegrationTests class, defined in libs/standard-tests/langchain_tests/integration_tests/chat_models.py (lines 801–858). It verifies that a chat model can be used in a streaming context via model.stream().

Dependency Diagram

graph TD
  4746cc66_9310_026b_aafb_b0b7f315c15d["test_stream()"]
  971e928f_9c9b_ce7a_b93d_e762f2f5aa54["ChatModelIntegrationTests"]
  4746cc66_9310_026b_aafb_b0b7f315c15d -->|defined in| 971e928f_9c9b_ce7a_b93d_e762f2f5aa54
  style 4746cc66_9310_026b_aafb_b0b7f315c15d fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/standard-tests/langchain_tests/integration_tests/chat_models.py lines 801–858

    def test_stream(self, model: BaseChatModel) -> None:
        """Test to verify that `model.stream(simple_message)` works.

        This should pass for all integrations. Passing this test does not indicate
        a "streaming" implementation, but rather that the model can be used in a
        streaming context.

        ??? question "Troubleshooting"

            First, debug
            `langchain_tests.integration_tests.chat_models.ChatModelIntegrationTests.test_invoke`,
            because `stream` has a default implementation that calls `invoke` and
            yields the result as a single chunk.

            If that test passes but not this one, you should make sure your `_stream`
            method does not raise any exceptions, and that it yields valid
            `langchain_core.outputs.chat_generation.ChatGenerationChunk`
            objects like so:

            ```python
            yield ChatGenerationChunk(message=AIMessageChunk(content="chunk text"))
            ```

            The final chunk must have `chunk_position='last'` to signal stream
            completion. This enables proper parsing of `tool_call_chunks` into
            `tool_calls` on the aggregated message:

            ```python
            for i, token in enumerate(tokens):
                is_last = i == len(tokens) - 1
                yield ChatGenerationChunk(
                    message=AIMessageChunk(
                        content=token,
                        chunk_position="last" if is_last else None,
                    )
                )
            ```
        """
        chunks: list[AIMessageChunk] = []
        full: AIMessageChunk | None = None
        for chunk in model.stream("Hello"):
            assert chunk is not None
            assert isinstance(chunk, AIMessageChunk)
            assert isinstance(chunk.content, str | list)
            chunks.append(chunk)
            full = chunk if full is None else full + chunk
        assert len(chunks) > 0
        assert isinstance(full, AIMessageChunk)
        assert full.content
        assert len(full.content_blocks) == 1
        assert full.content_blocks[0]["type"] == "text"

        # Verify chunk_position signaling
        last_chunk = chunks[-1]
        assert last_chunk.chunk_position == "last", (
            f"Final chunk must have chunk_position='last', "
            f"got {last_chunk.chunk_position!r}"
        )
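The troubleshooting notes in the docstring describe what a conforming `_stream` implementation looks like. The following is a minimal sketch of a chat model that would satisfy this test; the `EchoChatModel` name and its echo behavior are invented for illustration, and it assumes the installed langchain-core version exposes `chunk_position` on `AIMessageChunk` as the docstring above indicates:

```python
from collections.abc import Iterator
from typing import Any, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models import BaseChatModel
from langchain_core.messages import AIMessage, AIMessageChunk, BaseMessage
from langchain_core.outputs import ChatGeneration, ChatGenerationChunk, ChatResult


class EchoChatModel(BaseChatModel):
    """Hypothetical model that streams the last input message back, word by word."""

    @property
    def _llm_type(self) -> str:
        return "echo-chat-model"

    def _generate(
        self,
        messages: list[BaseMessage],
        stop: Optional[list[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> ChatResult:
        # Non-streaming path: return the whole echo as a single message.
        text = str(messages[-1].content)
        return ChatResult(generations=[ChatGeneration(message=AIMessage(content=text))])

    def _stream(
        self,
        messages: list[BaseMessage],
        stop: Optional[list[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> Iterator[ChatGenerationChunk]:
        tokens = str(messages[-1].content).split()
        for i, token in enumerate(tokens):
            is_last = i == len(tokens) - 1
            chunk = ChatGenerationChunk(
                message=AIMessageChunk(
                    content=token + " ",
                    # Mark the final chunk so tool_call_chunks can be parsed into
                    # tool_calls on the aggregated message (assumes `chunk_position`
                    # is available, per the docstring above).
                    chunk_position="last" if is_last else None,
                )
            )
            if run_manager:
                run_manager.on_llm_new_token(token, chunk=chunk)
            yield chunk
```

Because `_stream` yields valid `ChatGenerationChunk` objects, never raises, and marks the final chunk, the assertions in `test_stream` (non-empty chunk list, aggregated content, and `chunk_position == "last"`) would all pass against a model like this.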

Frequently Asked Questions

What does test_stream() do?
test_stream() is a standard integration test method on ChatModelIntegrationTests. It calls model.stream("Hello"), checks that every yielded chunk is a valid AIMessageChunk, and verifies that the chunks aggregate into a complete message whose final chunk carries chunk_position='last'.
Where is test_stream() defined?
test_stream() is defined on the ChatModelIntegrationTests class in libs/standard-tests/langchain_tests/integration_tests/chat_models.py, starting at line 801. To run it against your own chat model, subclass ChatModelIntegrationTests as sketched below.
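As a usage sketch, a provider package typically runs this test by subclassing ChatModelIntegrationTests and pointing it at its own chat model; the provider class, module path, and constructor parameters below are hypothetical placeholders:

```python
# tests/integration_tests/test_chat_models.py
from langchain_tests.integration_tests import ChatModelIntegrationTests

# Hypothetical provider model used for illustration.
from my_provider.chat_models import ChatMyProvider


class TestChatMyProviderIntegration(ChatModelIntegrationTests):
    @property
    def chat_model_class(self) -> type[ChatMyProvider]:
        # The chat model class the standard tests should instantiate.
        return ChatMyProvider

    @property
    def chat_model_params(self) -> dict:
        # Constructor kwargs used to build the model under test.
        return {"model": "my-model-001", "temperature": 0}
```

Running `pytest tests/integration_tests/` would then execute test_stream(), along with the rest of the standard integration suite, against the provider's model.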
