test_stream_reasoning_none() — langchain Function Reference
Architecture documentation for the test_stream_reasoning_none() function in test_chat_models_reasoning.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    d121a5dc_407b_75d4_29c4_69e9c35eb815["test_stream_reasoning_none()"]
    5a5c2d7b_4823_4697_a3e1_c5e1c3fce238["test_chat_models_reasoning.py"]
    d121a5dc_407b_75d4_29c4_69e9c35eb815 -->|defined in| 5a5c2d7b_4823_4697_a3e1_c5e1c3fce238
    style d121a5dc_407b_75d4_29c4_69e9c35eb815 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py lines 48–75
async def test_stream_reasoning_none(model: str, use_async: bool) -> None:
    """Test streaming with `reasoning=None`."""
    llm = ChatOllama(model=model, num_ctx=2**12, reasoning=None)
    messages = [
        {
            "role": "user",
            "content": SAMPLE,
        }
    ]
    result = None
    if use_async:
        async for chunk in llm.astream(messages):
            assert isinstance(chunk, BaseMessageChunk)
            if result is None:
                result = chunk
                continue
            result += chunk
    else:
        for chunk in llm.stream(messages):
            assert isinstance(chunk, BaseMessageChunk)
            if result is None:
                result = chunk
                continue
            result += chunk
    assert isinstance(result, AIMessageChunk)
    assert result.content
    # reasoning_content is only captured when reasoning=True
    assert "reasoning_content" not in result.additional_kwargs
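The streaming loop above folds message chunks together with `+`: the first chunk seeds `result`, and every later chunk is merged into it. A minimal sketch of that accumulation pattern, using a hypothetical stand-in class rather than langchain's real `BaseMessageChunk`:

```python
from dataclasses import dataclass, field


@dataclass
class FakeChunk:
    """Hypothetical stand-in for BaseMessageChunk: `+` concatenates content
    and merges additional_kwargs, mirroring the test's accumulation loop."""
    content: str
    additional_kwargs: dict = field(default_factory=dict)

    def __add__(self, other: "FakeChunk") -> "FakeChunk":
        return FakeChunk(
            content=self.content + other.content,
            additional_kwargs={**self.additional_kwargs, **other.additional_kwargs},
        )


def accumulate(stream):
    """Fold a stream of chunks into one message, as the test body does."""
    result = None
    for chunk in stream:
        result = chunk if result is None else result + chunk
    return result


msg = accumulate([FakeChunk("Hel"), FakeChunk("lo"), FakeChunk(", world")])
print(msg.content)                                   # Hello, world
print("reasoning_content" in msg.additional_kwargs)  # False
```

With `reasoning=None` no chunk ever carries a `reasoning_content` key, so the merged `additional_kwargs` stays free of it, which is exactly what the final assertion checks.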
Frequently Asked Questions
What does test_stream_reasoning_none() do?
test_stream_reasoning_none() is an integration test in the langchain codebase, defined in libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py. It streams a ChatOllama response with reasoning=None (via either stream() or astream(), depending on the use_async flag), accumulates the chunks into one message, and asserts that the final AIMessageChunk has content but no reasoning_content key in additional_kwargs.
Where is test_stream_reasoning_none() defined?
test_stream_reasoning_none() is defined in libs/partners/ollama/tests/integration_tests/chat_models/test_chat_models_reasoning.py at line 48.