
test_llms.py — langchain Source File

Architecture documentation for test_llms.py, a Python file in the langchain codebase. It has 3 imports and 0 dependents.

File · Python · CoreAbstractions · MessageSchema · 3 imports · 4 functions

Entity Profile

Dependency Diagram

graph LR
  50e82ab9_0a4d_4c97_a0b8_29fe19574574["test_llms.py"]
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  50e82ab9_0a4d_4c97_a0b8_29fe19574574 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  525a7d6f_f455_56e3_854a_c8a7da4a1417["unittest.mock"]
  50e82ab9_0a4d_4c97_a0b8_29fe19574574 --> 525a7d6f_f455_56e3_854a_c8a7da4a1417
  db45efca_f405_3cb7_f827_a9e6174e5158["langchain_ollama"]
  50e82ab9_0a4d_4c97_a0b8_29fe19574574 --> db45efca_f405_3cb7_f827_a9e6174e5158
  style 50e82ab9_0a4d_4c97_a0b8_29fe19574574 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Test Ollama Chat API wrapper."""

from typing import Any
from unittest.mock import patch

from langchain_ollama import OllamaLLM

MODEL_NAME = "llama3.1"


def test_initialization() -> None:
    """Test integration initialization."""
    OllamaLLM(model=MODEL_NAME)


def test_model_params() -> None:
    """Test standard tracing params"""
    llm = OllamaLLM(model=MODEL_NAME)
    ls_params = llm._get_ls_params()
    assert ls_params == {
        "ls_provider": "ollama",
        "ls_model_type": "llm",
        "ls_model_name": MODEL_NAME,
    }

    llm = OllamaLLM(model=MODEL_NAME, num_predict=3)
    ls_params = llm._get_ls_params()
    assert ls_params == {
        "ls_provider": "ollama",
        "ls_model_type": "llm",
        "ls_model_name": MODEL_NAME,
        "ls_max_tokens": 3,
    }


@patch("langchain_ollama.llms.validate_model")
def test_validate_model_on_init(mock_validate_model: Any) -> None:
    """Test that the model is validated on initialization when requested."""
    OllamaLLM(model=MODEL_NAME, validate_model_on_init=True)
    mock_validate_model.assert_called_once()
    mock_validate_model.reset_mock()

    OllamaLLM(model=MODEL_NAME, validate_model_on_init=False)
    mock_validate_model.assert_not_called()
    OllamaLLM(model=MODEL_NAME)
    mock_validate_model.assert_not_called()


def test_reasoning_aggregation() -> None:
    """Test that reasoning chunks are aggregated into final response."""
    llm = OllamaLLM(model=MODEL_NAME, reasoning=True)
    prompts = ["some prompt"]
    mock_stream = [
        {"thinking": "I am thinking.", "done": False},
        {"thinking": " Still thinking.", "done": False},
        {"response": "Final Answer.", "done": True},
    ]

    with patch.object(llm, "_create_generate_stream") as mock_stream_method:
        mock_stream_method.return_value = iter(mock_stream)
        result = llm.generate(prompts)

    assert result.generations[0][0].generation_info is not None
    assert (
        result.generations[0][0].generation_info["thinking"]
        == "I am thinking. Still thinking."
    )
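The patch.object pattern in test_reasoning_aggregation is the key technique here: it swaps the instance's streaming method for a canned iterator, so the test exercises the aggregation logic without ever contacting a live Ollama server. Below is a minimal, self-contained sketch of that same pattern. FakeLLM and its methods are hypothetical stand-ins for illustration only, not the real OllamaLLM API.

from typing import Any, Iterator
from unittest.mock import patch


class FakeLLM:
    """Hypothetical stand-in for OllamaLLM, used only to illustrate the mock."""

    def _create_generate_stream(self, prompt: str) -> Iterator[dict[str, Any]]:
        raise RuntimeError("would contact a live Ollama server")

    def generate(self, prompt: str) -> str:
        # Aggregate "thinking" chunks across the stream, mirroring what
        # test_reasoning_aggregation asserts about OllamaLLM.
        thinking = ""
        for chunk in self._create_generate_stream(prompt):
            thinking += chunk.get("thinking", "")
        return thinking


llm = FakeLLM()
mock_stream = [
    {"thinking": "I am thinking.", "done": False},
    {"thinking": " Still thinking.", "done": False},
    {"response": "Final Answer.", "done": True},
]

# patch.object replaces the method on this one instance for the duration of
# the with block; setting return_value to iter(...) makes the mock behave
# like a one-shot stream of response chunks.
with patch.object(llm, "_create_generate_stream") as mock_stream_method:
    mock_stream_method.return_value = iter(mock_stream)
    assert llm.generate("some prompt") == "I am thinking. Still thinking."

Because the mock is scoped to the with block, the instance regains its real method afterward, which keeps tests isolated from one another.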

Dependencies

  • langchain_ollama
  • typing
  • unittest.mock

Frequently Asked Questions

What does test_llms.py do?
test_llms.py is a source file in the langchain codebase, written in Python. It contains unit tests for the OllamaLLM wrapper from langchain_ollama and belongs to the CoreAbstractions domain, MessageSchema subdomain.
What functions are defined in test_llms.py?
test_llms.py defines 4 functions: test_initialization, test_model_params, test_reasoning_aggregation, and test_validate_model_on_init.
What does test_llms.py depend on?
test_llms.py imports 3 modules: langchain_ollama, typing, and unittest.mock.
Where is test_llms.py in the architecture?
test_llms.py is located at libs/partners/ollama/tests/unit_tests/test_llms.py (domain: CoreAbstractions, subdomain: MessageSchema, directory: libs/partners/ollama/tests/unit_tests).
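Because every model-facing call is mocked, these unit tests need no running Ollama server. As a minimal sketch, assuming pytest is installed and the command runs from the repository root, the file can be invoked programmatically like this:

import pytest

# Run only this file's unit tests; no Ollama server is required because
# all model calls are stubbed out with unittest.mock.
pytest.main(["libs/partners/ollama/tests/unit_tests/test_llms.py", "-v"])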
