test_load_response_with_actual_content_is_not_skipped() — langchain Function Reference

Architecture documentation for the test_load_response_with_actual_content_is_not_skipped() function, a unit test in test_chat_models.py from the langchain codebase that verifies Ollama "load" responses carrying real content are not skipped by ChatOllama.

Dependency Diagram

graph TD
  db98d6ee_7f93_7855_884d_12c9cca8f232["test_load_response_with_actual_content_is_not_skipped()"]
  9c4a2438_9884_cbb0_3cf5_de8827531653["test_chat_models.py"]
  db98d6ee_7f93_7855_884d_12c9cca8f232 -->|defined in| 9c4a2438_9884_cbb0_3cf5_de8827531653
  style db98d6ee_7f93_7855_884d_12c9cca8f232 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/ollama/tests/unit_tests/test_chat_models.py lines 286–312

def test_load_response_with_actual_content_is_not_skipped(
    caplog: pytest.LogCaptureFixture,
) -> None:
    """Test load responses with actual content are NOT skipped and log no warning."""
    load_with_content_response = [
        {
            "model": "test-model",
            "created_at": "2025-01-01T00:00:00.000000000Z",
            "done": True,
            "done_reason": "load",
            "message": {"role": "assistant", "content": "This is actual content"},
        }
    ]

    with patch("langchain_ollama.chat_models.Client") as mock_client_class:
        mock_client = MagicMock()
        mock_client_class.return_value = mock_client
        mock_client.chat.return_value = load_with_content_response

        llm = ChatOllama(model="test-model")

        with caplog.at_level(logging.WARNING):
            result = llm.invoke([HumanMessage("Hello")])

        assert result.content == "This is actual content"
        assert result.response_metadata.get("done_reason") == "load"
        assert not caplog.text
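
For context, the behavior under test — dropping empty "load" progress responses while keeping ones that carry real content — can be sketched as below. The helper name `filter_load_responses` and the response shape are illustrative assumptions for this sketch, not the actual langchain_ollama implementation.

```python
import logging

logger = logging.getLogger(__name__)


def filter_load_responses(responses: list[dict]) -> list[dict]:
    """Keep responses that carry content; skip empty 'load' responses.

    Illustrative sketch only -- not the actual langchain_ollama code.
    """
    kept = []
    for resp in responses:
        content = resp.get("message", {}).get("content", "")
        if resp.get("done_reason") == "load" and not content:
            # An empty model-load acknowledgement: skip it and warn.
            logger.warning("Skipping empty 'load' response from Ollama.")
            continue
        kept.append(resp)
    return kept
```

Under this sketch, a "load" response whose message content is non-empty (as in the test fixture above) passes through untouched and produces no warning, which is exactly what the assertions on `result.content` and `caplog.text` check.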

Frequently Asked Questions

What does test_load_response_with_actual_content_is_not_skipped() do?
test_load_response_with_actual_content_is_not_skipped() is a unit test in the langchain codebase, defined in libs/partners/ollama/tests/unit_tests/test_chat_models.py. It verifies that an Ollama response with done_reason "load" that carries actual message content is returned to the caller rather than skipped, and that no warning is logged in the process.
Where is test_load_response_with_actual_content_is_not_skipped() defined?
test_load_response_with_actual_content_is_not_skipped() is defined in libs/partners/ollama/tests/unit_tests/test_chat_models.py at line 286.
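
The test relies on pytest's real `caplog` fixture to assert that no warning is emitted. A minimal self-contained illustration of that pattern, with a toy `process` function standing in for the model call (the function is an assumption for this sketch, not part of langchain):

```python
import logging

logger = logging.getLogger(__name__)


def process(content: str) -> str:
    # Toy stand-in: warn only when the content is empty.
    if not content:
        logger.warning("Empty content received.")
    return content


def test_no_warning_for_real_content(caplog) -> None:
    # caplog.at_level ensures WARNING-level records are captured.
    with caplog.at_level(logging.WARNING):
        result = process("This is actual content")
    assert result == "This is actual content"
    assert not caplog.text  # no warning was logged
```

Run under pytest, `caplog.text` stays empty for non-empty input, mirroring the `assert not caplog.text` check in the langchain test.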
