test_stream_no_reasoning() — langchain Function Reference
Architecture documentation for the test_stream_no_reasoning() function in test_chat_models_reasoning.py from the langchain codebase.
Dependency Diagram
graph TD
  96d1aa8e_5fa4_3760_bf9c_cbb87ee20ed3["test_stream_no_reasoning()"]
  5a5c2d7b_4823_4697_a3e1_c5e1c3fce238["test_chat_models_reasoning.py"]
  96d1aa8e_5fa4_3760_bf9c_cbb87ee20ed3 -->|defined in| 5a5c2d7b_4823_4697_a3e1_c5e1c3fce238
  style 96d1aa8e_5fa4_3760_bf9c_cbb87ee20ed3 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py lines 15–43
async def test_stream_no_reasoning(model: str, use_async: bool) -> None:
    """Test streaming with `reasoning=False`."""
    llm = ChatOllama(model=model, num_ctx=2**12, reasoning=False)
    messages = [
        {
            "role": "user",
            "content": SAMPLE,
        }
    ]
    result = None
    if use_async:
        async for chunk in llm.astream(messages):
            assert isinstance(chunk, BaseMessageChunk)
            if result is None:
                result = chunk
                continue
            result += chunk
    else:
        for chunk in llm.stream(messages):
            assert isinstance(chunk, BaseMessageChunk)
            if result is None:
                result = chunk
                continue
            result += chunk
    assert isinstance(result, AIMessageChunk)
    assert result.content
    assert "<think>" not in result.content
    assert "</think>" not in result.content
    assert "reasoning_content" not in result.additional_kwargs
Frequently Asked Questions
What does test_stream_no_reasoning() do?
test_stream_no_reasoning() is an integration test for ChatOllama in the langchain codebase, defined in libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py. It streams a response with reasoning=False, either synchronously via stream() or asynchronously via astream() depending on the use_async parameter, accumulates the chunks into a single AIMessageChunk, and asserts that the result has non-empty content, contains no <think> or </think> tags, and has no reasoning_content entry in additional_kwargs.
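Because the use_async parameter also exercises astream(), an equivalent asynchronous sketch looks like the following; the model name and prompt are again assumptions for illustration.

import asyncio

from langchain_ollama import ChatOllama

async def main() -> None:
    # Assumed model name, as above; reasoning=False suppresses thinking output.
    llm = ChatOllama(model="deepseek-r1:1.5b", reasoning=False)
    result = None
    async for chunk in llm.astream([{"role": "user", "content": "Say hello."}]):
        result = chunk if result is None else result + chunk
    assert result is not None
    assert "reasoning_content" not in result.additional_kwargs

asyncio.run(main())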
Where is test_stream_no_reasoning() defined?
test_stream_no_reasoning() is defined in libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py at line 15.