test_load_followed_by_content_response() — langchain Function Reference
Architecture documentation for the test_load_followed_by_content_response() function in test_chat_models.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    eab97f35_2596_d4dc_a1fb_c5c3803d49c6["test_load_followed_by_content_response()"]
    9c4a2438_9884_cbb0_3cf5_de8827531653["test_chat_models.py"]
    eab97f35_2596_d4dc_a1fb_c5c3803d49c6 -->|defined in| 9c4a2438_9884_cbb0_3cf5_de8827531653
    style eab97f35_2596_d4dc_a1fb_c5c3803d49c6 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/tests/unit_tests/test_chat_models.py lines 247–283
import logging
from unittest.mock import MagicMock, patch

import pytest
from langchain_core.messages import HumanMessage

from langchain_ollama import ChatOllama


def test_load_followed_by_content_response(
    caplog: pytest.LogCaptureFixture,
) -> None:
    """Test that load responses log a warning and are skipped when followed by content."""
    load_then_content_response = [
        {
            "model": "test-model",
            "created_at": "2025-01-01T00:00:00.000000000Z",
            "done": True,
            "done_reason": "load",
            "message": {"role": "assistant", "content": ""},
        },
        {
            "model": "test-model",
            "created_at": "2025-01-01T00:00:01.000000000Z",
            "done": True,
            "done_reason": "stop",
            "message": {
                "role": "assistant",
                "content": "Hello! How can I help you today?",
            },
        },
    ]
    with patch("langchain_ollama.chat_models.Client") as mock_client_class:
        mock_client = MagicMock()
        mock_client_class.return_value = mock_client
        mock_client.chat.return_value = load_then_content_response
        llm = ChatOllama(model="test-model")
        with caplog.at_level(logging.WARNING):
            result = llm.invoke([HumanMessage("Hello")])
        assert "Ollama returned empty response with done_reason='load'" in caplog.text
        assert result.content == "Hello! How can I help you today?"
        assert result.response_metadata.get("done_reason") == "stop"
Frequently Asked Questions
What does test_load_followed_by_content_response() do?
test_load_followed_by_content_response() is a unit test in the langchain codebase, defined in libs/partners/ollama/tests/unit_tests/test_chat_models.py. It verifies that when the Ollama client returns an empty response with done_reason='load' followed by a content response, ChatOllama logs a warning, skips the load response, and returns the content response (with done_reason='stop') to the caller.
Where is test_load_followed_by_content_response() defined?
test_load_followed_by_content_response() is defined in libs/partners/ollama/tests/unit_tests/test_chat_models.py at line 247.
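The test never talks to a running Ollama server: `unittest.mock.patch` replaces the `Client` class with a `MagicMock` whose `chat` return value is canned. The same pattern in isolation (the `Client` and `get_reply` names below are made up for illustration, not from langchain):

```python
from unittest.mock import MagicMock, patch


class Client:
    """Stand-in for a network client; chat() would normally hit a server."""

    def chat(self) -> dict:
        raise RuntimeError("would require a running server")


def get_reply() -> dict:
    # Code under test: instantiates Client and calls chat().
    return Client().chat()


# Patch Client where it is looked up, just as the test patches
# "langchain_ollama.chat_models.Client".
with patch(f"{__name__}.Client") as mock_client_class:
    mock_client = MagicMock()
    mock_client_class.return_value = mock_client
    mock_client.chat.return_value = {"message": {"content": "mocked"}}
    reply = get_reply()

assert reply["message"]["content"] == "mocked"
```

Patching the name at its point of use (rather than where the class is defined) is what lets the test intercept the client that ChatOllama constructs internally.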