_agenerate() — langchain Function Reference
Architecture documentation for the _agenerate() function in chat_models.py from the langchain codebase.
Dependency Diagram
graph TD
    c693a2d2_0366_3ee6_c134_76b810abb28a["_agenerate()"]
    19e4be00_71fb_5390_6768_f6e6158f49b4["ChatOllama"]
    c693a2d2_0366_3ee6_c134_76b810abb28a -->|defined in| 19e4be00_71fb_5390_6768_f6e6158f49b4
    2e66c349_d5d3_62e8_92b6_0bc2344b7f7b["_achat_stream_with_aggregation()"]
    c693a2d2_0366_3ee6_c134_76b810abb28a -->|calls| 2e66c349_d5d3_62e8_92b6_0bc2344b7f7b
    style c693a2d2_0366_3ee6_c134_76b810abb28a fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/langchain_ollama/chat_models.py lines 1204–1226
async def _agenerate(
self,
messages: list[BaseMessage],
stop: list[str] | None = None,
run_manager: AsyncCallbackManagerForLLMRun | None = None,
**kwargs: Any,
) -> ChatResult:
final_chunk = await self._achat_stream_with_aggregation(
messages, stop, run_manager, verbose=self.verbose, **kwargs
)
generation_info = final_chunk.generation_info
chat_generation = ChatGeneration(
message=AIMessage(
content=final_chunk.text,
usage_metadata=cast(
"AIMessageChunk", final_chunk.message
).usage_metadata,
tool_calls=cast("AIMessageChunk", final_chunk.message).tool_calls,
additional_kwargs=final_chunk.message.additional_kwargs,
),
generation_info=generation_info,
)
return ChatResult(generations=[chat_generation])
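The method delegates all streaming to _achat_stream_with_aggregation(), which folds every streamed chunk into one final chunk before the result is wrapped in a ChatResult. A minimal, self-contained sketch of that stream-then-aggregate pattern is below; the Chunk class, stream_chunks(), and achat_stream_with_aggregation() here are simplified stand-ins for langchain's ChatGenerationChunk machinery, not the library's actual types.

```python
import asyncio
from dataclasses import dataclass, field

@dataclass
class Chunk:
    # Simplified stand-in for langchain's ChatGenerationChunk.
    text: str
    generation_info: dict = field(default_factory=dict)

    def merge(self, other: "Chunk") -> "Chunk":
        # Concatenate text; later generation_info keys win, mirroring how
        # streamed chunks are folded into a single final chunk.
        return Chunk(
            self.text + other.text,
            {**self.generation_info, **other.generation_info},
        )

async def stream_chunks():
    # Simulated token stream; a real client would yield chunks from Ollama.
    yield Chunk("Hello")
    yield Chunk(", world")
    yield Chunk("!", {"done_reason": "stop"})

async def achat_stream_with_aggregation() -> Chunk:
    # Consume the stream, merging each chunk into the running aggregate.
    final = None
    async for chunk in stream_chunks():
        final = chunk if final is None else final.merge(chunk)
    if final is None:
        raise ValueError("No data received from stream.")
    return final

final_chunk = asyncio.run(achat_stream_with_aggregation())
print(final_chunk.text)             # "Hello, world!"
print(final_chunk.generation_info)  # metadata carried by the last chunk
```

In the real implementation, the final chunk's text, usage metadata, and tool calls are then copied into an AIMessage, as the source above shows.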
Frequently Asked Questions
What does _agenerate() do?
_agenerate() is the asynchronous generation method of ChatOllama, defined in libs/partners/ollama/langchain_ollama/chat_models.py. It delegates to _achat_stream_with_aggregation() to stream the model's response and aggregate the chunks, then wraps the final chunk's text, usage metadata, and tool calls in an AIMessage and returns it as a ChatResult.
Where is _agenerate() defined?
_agenerate() is defined in libs/partners/ollama/langchain_ollama/chat_models.py at line 1204.
What does _agenerate() call?
_agenerate() calls one function: _achat_stream_with_aggregation().