test_chat_models_reasoning.py — langchain Source File

Architecture documentation for test_chat_models_reasoning.py, a Python file in the langchain codebase. 3 imports, 0 dependents.

Type: File · Language: Python · Domain: LangChainCore · Subdomain: LanguageModelBase · 3 imports · 7 functions

Entity Profile

Dependency Diagram

graph LR
  test_file["test_chat_models_reasoning.py"]
  pytest_mod["pytest"]
  test_file --> pytest_mod
  core_messages["langchain_core.messages"]
  test_file --> core_messages
  ollama_mod["langchain_ollama"]
  test_file --> ollama_mod
  style test_file fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Ollama integration tests for reasoning chat models."""

import pytest
from langchain_core.messages import AIMessageChunk, BaseMessageChunk, HumanMessage

from langchain_ollama import ChatOllama

SAMPLE = "What is 3^3?"

REASONING_MODEL_NAME = "deepseek-r1:1.5b"


@pytest.mark.parametrize("model", [REASONING_MODEL_NAME])
@pytest.mark.parametrize("use_async", [False, True])
async def test_stream_no_reasoning(model: str, use_async: bool) -> None:
    """Test streaming with `reasoning=False`."""
    llm = ChatOllama(model=model, num_ctx=2**12, reasoning=False)
    messages = [
        {
            "role": "user",
            "content": SAMPLE,
        }
    ]
    result = None
    if use_async:
        async for chunk in llm.astream(messages):
            assert isinstance(chunk, BaseMessageChunk)
            if result is None:
                result = chunk
                continue
            result += chunk
    else:
        for chunk in llm.stream(messages):
            assert isinstance(chunk, BaseMessageChunk)
            if result is None:
                result = chunk
                continue
            result += chunk
    assert isinstance(result, AIMessageChunk)
    assert result.content
    assert "<think>" not in result.content
    assert "</think>" not in result.content
    assert "reasoning_content" not in result.additional_kwargs


@pytest.mark.parametrize("model", [REASONING_MODEL_NAME])
@pytest.mark.parametrize("use_async", [False, True])
async def test_stream_reasoning_none(model: str, use_async: bool) -> None:
    """Test streaming with `reasoning=None`."""
    llm = ChatOllama(model=model, num_ctx=2**12, reasoning=None)
    messages = [
        {
            "role": "user",
            "content": SAMPLE,
        }
    ]
    result = None
    if use_async:
        async for chunk in llm.astream(messages):
            assert isinstance(chunk, BaseMessageChunk)
# ... (167 more lines)
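
The streaming tests above assemble a full response by accumulating chunks with +=, which works because BaseMessageChunk overloads addition to merge partial messages. A minimal sketch of that accumulation pattern as a standalone helper (collect_stream is an illustrative name, not a function defined in this file):

from typing import Iterable, Optional

from langchain_core.messages import BaseMessageChunk


def collect_stream(chunks: Iterable[BaseMessageChunk]) -> Optional[BaseMessageChunk]:
    """Merge streamed message chunks into a single chunk via chunk addition."""
    result: Optional[BaseMessageChunk] = None
    for chunk in chunks:
        # BaseMessageChunk.__add__ concatenates content and merges metadata.
        result = chunk if result is None else result + chunk
    return result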

Domain

LangChainCore

Subdomains

LanguageModelBase

Dependencies

  • langchain_core.messages
  • langchain_ollama
  • pytest

Frequently Asked Questions

What does test_chat_models_reasoning.py do?
test_chat_models_reasoning.py is a source file in the langchain codebase, written in Python. It contains Ollama integration tests for the reasoning behavior of ChatOllama, exercising streaming and invoke calls in both sync and async form. It belongs to the LanguageModelBase subdomain of the LangChainCore domain.
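
In practice, the reasoning flag controls whether the model's chain-of-thought is separated from the final answer. A minimal sketch of the behavior these tests exercise (assumes a local Ollama server with the deepseek-r1:1.5b model pulled; the reasoning=True output shape is inferred from the assertions in the reasoning=False test above):

from langchain_ollama import ChatOllama

llm = ChatOllama(model="deepseek-r1:1.5b", reasoning=True)
result = llm.invoke("What is 3^3?")

# Final answer, without inline <think>...</think> tags.
print(result.content)
# With reasoning=True, the thinking trace is expected under
# additional_kwargs["reasoning_content"] rather than inline in content.
print(result.additional_kwargs.get("reasoning_content"))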
What functions are defined in test_chat_models_reasoning.py?
test_chat_models_reasoning.py defines 7 functions: test_invoke_no_reasoning, test_invoke_reasoning_none, test_reasoning_invoke, test_reasoning_modes_behavior, test_reasoning_stream, test_stream_no_reasoning, and test_stream_reasoning_none.
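
Each of these tests follows the parametrized shape visible in the source above: a single async test takes a use_async flag and exercises both llm.stream and llm.astream. A simplified sketch of that pattern (not the file's actual code; assumes pytest-asyncio in auto mode, since an async def test needs an async-aware runner):

import pytest

from langchain_ollama import ChatOllama


@pytest.mark.parametrize("use_async", [False, True])
async def test_both_paths(use_async: bool) -> None:
    llm = ChatOllama(model="deepseek-r1:1.5b", reasoning=False)
    if use_async:
        chunks = [chunk async for chunk in llm.astream("What is 3^3?")]
    else:
        chunks = list(llm.stream("What is 3^3?"))
    # Both code paths should yield at least one streamed chunk.
    assert chunks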
What does test_chat_models_reasoning.py depend on?
test_chat_models_reasoning.py imports 3 modules: langchain_core.messages, langchain_ollama, and pytest.
Where is test_chat_models_reasoning.py in the architecture?
test_chat_models_reasoning.py is located at libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py (domain: LangChainCore, subdomain: LanguageModelBase, directory: libs/partners/ollama/tests/integration_tests/chat_models).
