
test_chat_models.py — langchain Source File

Architecture documentation for test_chat_models.py, a Python file in the langchain codebase. 12 imports, 0 dependents.

File · Python · CoreAbstractions · MessageSchema · 12 imports · 9 functions · 4 classes

Entity Profile

Dependency Diagram

graph LR
  d2cf815c_5ff6_56e0_564d_f07623592652["test_chat_models.py"]
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  525a7d6f_f455_56e3_854a_c8a7da4a1417["unittest.mock"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 525a7d6f_f455_56e3_854a_c8a7da4a1417
  120e2591_3e15_b895_72b6_cb26195e40a6["pytest"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 120e2591_3e15_b895_72b6_cb26195e40a6
  1803c8c1_a347_1256_1454_9f04c3553d93["httpx"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 1803c8c1_a347_1256_1454_9f04c3553d93
  4eb42b7a_5c64_04cb_fcec_1401d5c10628["langchain_core.messages.ai"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 4eb42b7a_5c64_04cb_fcec_1401d5c10628
  699f0cfe_f0f4_e8ce_6195_bb7fdfae37f4["langchain_core.messages.human"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 699f0cfe_f0f4_e8ce_6195_bb7fdfae37f4
  552bc7bf_c1ac_965d_e157_ee750ab1993c["langchain_core.messages.tool"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 552bc7bf_c1ac_965d_e157_ee750ab1993c
  43d88577_548b_2248_b01b_7987bae85dcc["langchain_core.tools"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 43d88577_548b_2248_b01b_7987bae85dcc
  e36ef4a1_87ee_d91e_3f75_05e353ec925c["ollama"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> e36ef4a1_87ee_d91e_3f75_05e353ec925c
  6e58aaea_f08e_c099_3cc7_f9567bfb1ae7["pydantic"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 6e58aaea_f08e_c099_3cc7_f9567bfb1ae7
  91721f45_4909_e489_8c1f_084f8bd87145["typing_extensions"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> 91721f45_4909_e489_8c1f_084f8bd87145
  db45efca_f405_3cb7_f827_a9e6174e5158["langchain_ollama"]
  d2cf815c_5ff6_56e0_564d_f07623592652 --> db45efca_f405_3cb7_f827_a9e6174e5158
  style d2cf815c_5ff6_56e0_564d_f07623592652 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Ollama specific chat model integration tests"""

from __future__ import annotations

from typing import Annotated
from unittest.mock import MagicMock, patch

import pytest
from httpx import ConnectError
from langchain_core.messages.ai import AIMessage, AIMessageChunk
from langchain_core.messages.human import HumanMessage
from langchain_core.messages.tool import ToolCallChunk, ToolMessage
from langchain_core.tools import tool
from ollama import ResponseError
from pydantic import BaseModel, Field, ValidationError
from typing_extensions import TypedDict

from langchain_ollama import ChatOllama

DEFAULT_MODEL_NAME = "llama3.1"
REASONING_MODEL_NAME = "gpt-oss:20b"


@tool
def get_current_weather(location: str) -> dict:
    """Gets the current weather in a given location."""
    if "boston" in location.lower():
        return {"temperature": "15°F", "conditions": "snow"}
    return {"temperature": "unknown", "conditions": "unknown"}


@patch("langchain_ollama.chat_models.Client.list")
def test_init_model_not_found(mock_list: MagicMock) -> None:
    """Test that a ValueError is raised when the model is not found."""
    mock_list.side_effect = ValueError("Test model not found")
    with pytest.raises(ValueError) as excinfo:
        ChatOllama(model="non-existent-model", validate_model_on_init=True)
    assert "Test model not found" in str(excinfo.value)


@patch("langchain_ollama.chat_models.Client.list")
def test_init_connection_error(mock_list: MagicMock) -> None:
    """Test that a `ValidationError` is raised on connect failure during init."""
    mock_list.side_effect = ConnectError("Test connection error")

    with pytest.raises(ValidationError) as excinfo:
        ChatOllama(model="any-model", validate_model_on_init=True)
    assert "Failed to connect to Ollama" in str(excinfo.value)


@patch("langchain_ollama.chat_models.Client.list")
def test_init_response_error(mock_list: MagicMock) -> None:
    """Test that a ResponseError is raised."""
    mock_list.side_effect = ResponseError("Test response error")

    with pytest.raises(ValidationError) as excinfo:
        ChatOllama(model="any-model", validate_model_on_init=True)
    assert "Received an error from the Ollama API" in str(excinfo.value)


# ... (236 more lines)
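
The elided lines cover the remaining tests listed in the FAQ below (test_agent_loop, test_tool_streaming, test_tool_astreaming, test_structured_output, and test_structured_output_deeply_nested). As an illustration only, here is a minimal sketch of the kind of tool-calling round trip that test_agent_loop would exercise, reusing the get_current_weather tool defined above; it assumes a local Ollama server with the llama3.1 model pulled and does not reproduce the file's actual code or assertions.

# Illustrative sketch, not the repository's test code: one tool-calling
# round trip with ChatOllama. Assumes a local Ollama server that has the
# llama3.1 model pulled (DEFAULT_MODEL_NAME above).
from langchain_core.messages.human import HumanMessage
from langchain_core.messages.tool import ToolMessage
from langchain_core.tools import tool
from langchain_ollama import ChatOllama


@tool
def get_current_weather(location: str) -> dict:
    """Gets the current weather in a given location."""
    if "boston" in location.lower():
        return {"temperature": "15°F", "conditions": "snow"}
    return {"temperature": "unknown", "conditions": "unknown"}


llm = ChatOllama(model="llama3.1").bind_tools([get_current_weather])

messages = [HumanMessage("What is the weather in Boston?")]
ai_msg = llm.invoke(messages)  # AIMessage, expected to carry tool_calls
messages.append(ai_msg)

for call in ai_msg.tool_calls:  # run each requested tool and report back
    result = get_current_weather.invoke(call["args"])
    messages.append(ToolMessage(content=str(result), tool_call_id=call["id"]))

final = llm.invoke(messages)  # model folds the tool result into its answer
print(final.content)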

Subdomains

  • MessageSchema

Dependencies

  • httpx
  • langchain_core.messages.ai
  • langchain_core.messages.human
  • langchain_core.messages.tool
  • langchain_core.tools
  • langchain_ollama
  • ollama
  • pydantic
  • pytest
  • typing
  • typing_extensions
  • unittest.mock

Frequently Asked Questions

What does test_chat_models.py do?
test_chat_models.py is a source file in the langchain codebase, written in Python. It belongs to the CoreAbstractions domain, MessageSchema subdomain.
What functions are defined in test_chat_models.py?
test_chat_models.py defines 9 functions: get_current_weather, test_agent_loop, test_init_connection_error, test_init_model_not_found, test_init_response_error, test_structured_output, test_structured_output_deeply_nested, test_tool_astreaming, and test_tool_streaming (see the structured-output sketch after this FAQ).
What does test_chat_models.py depend on?
test_chat_models.py imports 12 modules: httpx, langchain_core.messages.ai, langchain_core.messages.human, langchain_core.messages.tool, langchain_core.tools, langchain_ollama, ollama, pydantic, and 4 more.
Where is test_chat_models.py in the architecture?
test_chat_models.py is located at libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models.py (domain: CoreAbstractions, subdomain: MessageSchema, directory: libs/partners/ollama/tests/integration_tests/chat_models).
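
The structured-output tests named above (test_structured_output and test_structured_output_deeply_nested) presumably go through LangChain's standard with_structured_output API. A minimal sketch of that pattern follows; the Joke schema is an invented example, not the schema used in the file, and a local Ollama server with llama3.1 is assumed.

# Illustrative sketch, not the repository's test code: structured output
# with ChatOllama via with_structured_output. The Joke schema is invented
# for this example; a local llama3.1 model is assumed.
from pydantic import BaseModel, Field

from langchain_ollama import ChatOllama


class Joke(BaseModel):
    """A joke, split into setup and punchline."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")


llm = ChatOllama(model="llama3.1")
structured_llm = llm.with_structured_output(Joke)

joke = structured_llm.invoke("Tell me a joke about programming.")
assert isinstance(joke, Joke)  # output is parsed into the Pydantic model
print(joke.setup, "--", joke.punchline)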
