_stream_with_aggregation() — langchain Function Reference
Architecture documentation for the _stream_with_aggregation() function in llms.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    stream["_stream_with_aggregation()"]
    ollama_llm["OllamaLLM"]
    stream -->|defined in| ollama_llm
    generate["_generate()"]
    generate -->|calls| stream
    create_stream["_create_generate_stream()"]
    stream -->|calls| create_stream
    style stream fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/langchain_ollama/llms.py lines 409–449
def _stream_with_aggregation(
    self,
    prompt: str,
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    verbose: bool = False,  # noqa: FBT002
    **kwargs: Any,
) -> GenerationChunk:
    final_chunk = None
    thinking_content = ""
    for stream_resp in self._create_generate_stream(prompt, stop, **kwargs):
        if not isinstance(stream_resp, str):
            if stream_resp.get("thinking"):
                thinking_content += stream_resp["thinking"]
            chunk = GenerationChunk(
                text=stream_resp.get("response", ""),
                generation_info=(
                    dict(stream_resp) if stream_resp.get("done") is True else None
                ),
            )
            if final_chunk is None:
                final_chunk = chunk
            else:
                final_chunk += chunk
            if run_manager:
                run_manager.on_llm_new_token(
                    chunk.text,
                    chunk=chunk,
                    verbose=verbose,
                )
    if final_chunk is None:
        msg = "No data received from Ollama stream."
        raise ValueError(msg)
    if thinking_content:
        if final_chunk.generation_info:
            final_chunk.generation_info["thinking"] = thinking_content
        else:
            final_chunk.generation_info = {"thinking": thinking_content}
    return final_chunk
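The aggregation pattern above can be exercised in isolation. The sketch below re-implements the same logic against a simplified stand-in for langchain_core's GenerationChunk; the `Chunk` class and `aggregate()` helper here are illustrative assumptions, not the real langchain API (the real `GenerationChunk.__add__` also merges `generation_info` dictionaries):

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Chunk:
    """Simplified stand-in for langchain_core's GenerationChunk (assumption)."""

    text: str
    generation_info: dict | None = None

    def __add__(self, other: Chunk) -> Chunk:
        # The later chunk's generation_info (set only on the final "done"
        # response) takes precedence, so the aggregated chunk ends up
        # carrying the stream's closing metadata.
        return Chunk(
            text=self.text + other.text,
            generation_info=other.generation_info or self.generation_info,
        )


def aggregate(stream) -> Chunk:
    """Fold a stream of Ollama-style response dicts into one chunk."""
    final = None
    thinking = ""
    for resp in stream:
        if isinstance(resp, str):
            continue  # skip bare-string frames, as the original does
        if resp.get("thinking"):
            thinking += resp["thinking"]
        chunk = Chunk(
            text=resp.get("response", ""),
            generation_info=dict(resp) if resp.get("done") is True else None,
        )
        final = chunk if final is None else final + chunk
    if final is None:
        raise ValueError("No data received from Ollama stream.")
    if thinking:
        if final.generation_info:
            final.generation_info["thinking"] = thinking
        else:
            final.generation_info = {"thinking": thinking}
    return final


# Example stream of Ollama-style dicts: text arrives in pieces, metadata
# arrives only on the final "done" frame.
stream = [
    {"response": "Hello", "thinking": "greet the user"},
    {"response": ", world", "done": True, "total_duration": 123},
]
result = aggregate(stream)
```

This mirrors why the function returns a single GenerationChunk rather than a list: callers that do not stream still get one object carrying the full text plus the final response's metadata.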
Frequently Asked Questions
What does _stream_with_aggregation() do?
_stream_with_aggregation() consumes the streaming response from _create_generate_stream(), concatenates each streamed piece into a single GenerationChunk, accumulates any "thinking" content into the final chunk's generation_info, and fires a per-token callback via the run manager as each chunk arrives. If the stream yields no data, it raises a ValueError.
Where is _stream_with_aggregation() defined?
_stream_with_aggregation() is defined in libs/partners/ollama/langchain_ollama/llms.py at line 409.
What does _stream_with_aggregation() call?
_stream_with_aggregation() calls 1 function(s): _create_generate_stream.
What calls _stream_with_aggregation()?
_stream_with_aggregation() is called by 1 function(s): _generate.
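The caller relationship can be sketched as follows. This is a hypothetical stand-in for how a method like OllamaLLM._generate() might use _stream_with_aggregation() — one aggregated chunk per input prompt; the names `stream_with_aggregation_stub` and `generate` are illustrative, not the real langchain signatures:

```python
def stream_with_aggregation_stub(prompt: str) -> dict:
    # Stand-in: pretend the whole stream for this prompt has already been
    # folded into one final chunk, as _stream_with_aggregation() does.
    return {"text": f"echo: {prompt}", "generation_info": {"done": True}}


def generate(prompts: list[str]) -> list[dict]:
    # Mirrors the _generate() -> _stream_with_aggregation() call pattern:
    # the caller loops over prompts and collects one aggregated
    # generation per prompt into the overall result.
    return [stream_with_aggregation_stub(p) for p in prompts]


results = generate(["hi", "bye"])
```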