_generate() — langchain Function Reference
Architecture documentation for the _generate() function in chat_models.py from the langchain codebase.
Dependency Diagram
graph TD
    gen["_generate()"]
    ollama["ChatOllama"]
    gen -->|defined in| ollama
    agg["_chat_stream_with_aggregation()"]
    gen -->|calls| agg
    style gen fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/langchain_ollama/chat_models.py lines 1026–1048
def _generate(
    self,
    messages: list[BaseMessage],
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    **kwargs: Any,
) -> ChatResult:
    final_chunk = self._chat_stream_with_aggregation(
        messages, stop, run_manager, verbose=self.verbose, **kwargs
    )
    generation_info = final_chunk.generation_info
    chat_generation = ChatGeneration(
        message=AIMessage(
            content=final_chunk.text,
            usage_metadata=cast(
                "AIMessageChunk", final_chunk.message
            ).usage_metadata,
            tool_calls=cast("AIMessageChunk", final_chunk.message).tool_calls,
            additional_kwargs=final_chunk.message.additional_kwargs,
        ),
        generation_info=generation_info,
    )
    return ChatResult(generations=[chat_generation])
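As the listing shows, _generate() never talks to the Ollama API directly: it delegates to _chat_stream_with_aggregation(), which consumes the response stream and folds every chunk into a single final chunk, whose text and metadata _generate() then repackages as a ChatResult. The aggregation pattern can be sketched in isolation; the ChunkStub class below is a hypothetical stand-in for illustration, not a langchain type:

```python
from dataclasses import dataclass, field


@dataclass
class ChunkStub:
    """Hypothetical stand-in for a streamed generation chunk."""

    text: str
    generation_info: dict = field(default_factory=dict)

    def __add__(self, other: "ChunkStub") -> "ChunkStub":
        # Concatenate text and merge metadata, letting later
        # chunks (e.g. the final "done" chunk) override earlier keys.
        return ChunkStub(
            text=self.text + other.text,
            generation_info={**self.generation_info, **other.generation_info},
        )


def aggregate(chunks: list[ChunkStub]) -> ChunkStub:
    """Fold a stream of chunks into one final chunk, in the spirit of
    the _chat_stream_with_aggregation() helper that _generate() calls."""
    final = None
    for chunk in chunks:
        final = chunk if final is None else final + chunk
    if final is None:
        raise ValueError("No chunks received from stream.")
    return final


chunks = [
    ChunkStub("Hel"),
    ChunkStub("lo!", {"done_reason": "stop", "total_duration": 123}),
]
final = aggregate(chunks)
print(final.text)             # -> Hello!
print(final.generation_info)  # -> {'done_reason': 'stop', 'total_duration': 123}
```

This fold is why _generate() can safely read final_chunk.text and final_chunk.generation_info after the loop: only the fully merged result ever reaches it.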
Frequently Asked Questions
What does _generate() do?
_generate() is a method of the ChatOllama class, defined in libs/partners/ollama/langchain_ollama/chat_models.py. It implements the non-streaming chat path: it delegates to _chat_stream_with_aggregation() to consume and merge the streamed response, then wraps the aggregated final chunk's text, usage metadata, and tool calls in a ChatGeneration and returns a ChatResult.
Where is _generate() defined?
_generate() is defined in libs/partners/ollama/langchain_ollama/chat_models.py at line 1026.
What does _generate() call?
_generate() calls one function: _chat_stream_with_aggregation().