_chat_params() — langchain Function Reference
Architecture documentation for the _chat_params() function in chat_models.py from the langchain codebase.
Dependency Diagram
graph TD
    b4299926_33ce_4715_e8bf_06ff61a39c14["_chat_params()"]
    19e4be00_71fb_5390_6768_f6e6158f49b4["ChatOllama"]
    b4299926_33ce_4715_e8bf_06ff61a39c14 -->|defined in| 19e4be00_71fb_5390_6768_f6e6158f49b4
    e9825a4f_e9d8_3008_6f5d_48284410cad7["_acreate_chat_stream()"]
    e9825a4f_e9d8_3008_6f5d_48284410cad7 -->|calls| b4299926_33ce_4715_e8bf_06ff61a39c14
    54502394_47d8_fe09_4e7a_ec02c5d30d0c["_create_chat_stream()"]
    54502394_47d8_fe09_4e7a_ec02c5d30d0c -->|calls| b4299926_33ce_4715_e8bf_06ff61a39c14
    24455ecf_f2a0_74af_3dc1_500c0b6a35fa["_convert_messages_to_ollama_messages()"]
    b4299926_33ce_4715_e8bf_06ff61a39c14 -->|calls| 24455ecf_f2a0_74af_3dc1_500c0b6a35fa
    style b4299926_33ce_4715_e8bf_06ff61a39c14 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/langchain_ollama/chat_models.py lines 720–788
def _chat_params(
    self,
    messages: list[BaseMessage],
    stop: list[str] | None = None,
    **kwargs: Any,
) -> dict[str, Any]:
    """Assemble the parameters for a chat completion request.

    Args:
        messages: List of LangChain messages to send to the model.
        stop: Optional list of stop tokens to use for this invocation.
        **kwargs: Additional keyword arguments to include in the request.

    Returns:
        A dictionary of parameters to pass to the Ollama client.
    """
    ollama_messages = self._convert_messages_to_ollama_messages(messages)

    if self.stop is not None and stop is not None:
        msg = "`stop` found in both the input and default params."
        raise ValueError(msg)
    if self.stop is not None:
        stop = self.stop

    options_dict = kwargs.pop("options", None)
    if options_dict is None:
        # Only include parameters that are explicitly set (not None)
        options_dict = {
            k: v
            for k, v in {
                "mirostat": self.mirostat,
                "mirostat_eta": self.mirostat_eta,
                "mirostat_tau": self.mirostat_tau,
                "num_ctx": self.num_ctx,
                "num_gpu": self.num_gpu,
                "num_thread": self.num_thread,
                "num_predict": self.num_predict,
                "repeat_last_n": self.repeat_last_n,
                "repeat_penalty": self.repeat_penalty,
                "temperature": self.temperature,
                "seed": self.seed,
                "stop": self.stop if stop is None else stop,
                "tfs_z": self.tfs_z,
                "top_k": self.top_k,
                "top_p": self.top_p,
            }.items()
            if v is not None
        }

    params = {
        "messages": ollama_messages,
        "stream": kwargs.pop("stream", True),
        "model": kwargs.pop("model", self.model),
        "think": kwargs.pop("reasoning", self.reasoning),
        "format": kwargs.pop("format", self.format),
        "options": options_dict,
        "keep_alive": kwargs.pop("keep_alive", self.keep_alive),
        **kwargs,
    }
    # Filter out 'strict' argument if present, as it is not supported by Ollama
    # but may be passed by upstream libraries (e.g. LangChain ProviderStrategy)
    if "strict" in params:
        params.pop("strict")
    if tools := kwargs.get("tools"):
        params["tools"] = tools
    return params
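The None-filtering step in the listing above can be sketched in isolation. Here `filter_options` is a hypothetical helper name (not part of langchain_ollama) that mirrors the dict comprehension in `_chat_params()`: only model options that were explicitly set survive into the request's `options` dict.

```python
from typing import Any


def filter_options(raw: dict[str, Any]) -> dict[str, Any]:
    """Keep only options that were explicitly set (not None),
    mirroring the dict comprehension in _chat_params()."""
    return {k: v for k, v in raw.items() if v is not None}


# Options left at their None defaults are omitted from the payload.
options = filter_options(
    {"temperature": 0.2, "num_ctx": None, "top_p": 0.9, "seed": None}
)
print(options)  # {'temperature': 0.2, 'top_p': 0.9}
```

This keeps the request payload minimal, so the Ollama server applies its own defaults for anything the user never configured.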
Frequently Asked Questions
What does _chat_params() do?
_chat_params() assembles the parameter dictionary for an Ollama chat completion request. It converts LangChain messages to Ollama's message format, resolves stop tokens (raising a ValueError if they are set both on the instance and per call), builds an options dict from explicitly set model parameters, and merges per-call keyword arguments over instance defaults. It is defined in libs/partners/ollama/langchain_ollama/chat_models.py.
Where is _chat_params() defined?
_chat_params() is defined in libs/partners/ollama/langchain_ollama/chat_models.py at line 720.
What does _chat_params() call?
_chat_params() calls one function: _convert_messages_to_ollama_messages().
What calls _chat_params()?
_chat_params() is called by two functions: _acreate_chat_stream() and _create_chat_stream().