_convert_messages_to_ollama_messages() — langchain Function Reference
Architecture documentation for the _convert_messages_to_ollama_messages() function in chat_models.py from the langchain codebase.
Dependency Diagram
graph TD
    fn["_convert_messages_to_ollama_messages()"]
    cls["ChatOllama"]
    fn -->|defined in| cls
    chat_params["_chat_params()"]
    chat_params -->|calls| fn
    to_openai["_lc_tool_call_to_openai_tool_call()"]
    fn -->|calls| to_openai
    get_image["_get_image_from_data_content_block()"]
    fn -->|calls| get_image
    style fn fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/ollama/langchain_ollama/chat_models.py lines 812–929
def _convert_messages_to_ollama_messages(
    self, messages: list[BaseMessage]
) -> Sequence[Message]:
    """Convert a BaseMessage list to list of messages for Ollama to consume.

    Args:
        messages: List of BaseMessage to convert.

    Returns:
        List of messages in Ollama format.
    """
    for idx, message in enumerate(messages):
        # Handle message content written in v1 format
        if (
            isinstance(message, AIMessage)
            and message.response_metadata.get("output_version") == "v1"
        ):
            # Unpack known v1 content to Ollama format for the request
            # Most types are passed through unchanged
            messages[idx] = message.model_copy(
                update={
                    "content": _convert_from_v1_to_ollama(
                        cast("list[types.ContentBlock]", message.content),
                        message.response_metadata.get("model_provider"),
                    )
                }
            )

    ollama_messages: list = []
    for message in messages:
        role: str
        tool_call_id: str | None = None
        tool_calls: list[dict[str, Any]] | None = None
        if isinstance(message, HumanMessage):
            role = "user"
        elif isinstance(message, AIMessage):
            role = "assistant"
            tool_calls = (
                [
                    _lc_tool_call_to_openai_tool_call(tool_call)
                    for tool_call in message.tool_calls
                ]
                if message.tool_calls
                else None
            )
        elif isinstance(message, SystemMessage):
            role = "system"
        elif isinstance(message, ChatMessage):
            role = message.role
        elif isinstance(message, ToolMessage):
            role = "tool"
            tool_call_id = message.tool_call_id
        else:
            msg = "Received unsupported message type for Ollama."
            raise TypeError(msg)

        content = ""
        images = []
        if isinstance(message.content, str):
            content = message.content
        else:  # List
            for content_part in message.content:
                if isinstance(content_part, str):
                    content += f"\n{content_part}"
                elif content_part.get("type") == "text":
                    content += f"\n{content_part['text']}"
                elif content_part.get("type") == "tool_use":
                    continue
                elif content_part.get("type") == "image_url":
                    image_url = None
                    temp_image_url = content_part.get("image_url")
                    if isinstance(temp_image_url, str):
                        image_url = temp_image_url
                    elif (
                        isinstance(temp_image_url, dict)
                        and "url" in temp_image_url
                        and isinstance(temp_image_url["url"], str)
                    ):
                        image_url = temp_image_url["url"]
                    else:
                        msg = (  # listing truncated here; the full function continues through line 929
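The conversion loop above can be sketched standalone, without LangChain or Ollama installed. The sketch below mirrors only the role mapping and text flattening from the listing; the message classes are minimal stand-ins for LangChain's types, not the real ones.

```python
from dataclasses import dataclass, field
from typing import Any

# Minimal stand-ins for LangChain's message classes (illustration only).
@dataclass
class HumanMessage:
    content: Any

@dataclass
class AIMessage:
    content: Any
    tool_calls: list = field(default_factory=list)

@dataclass
class SystemMessage:
    content: Any

@dataclass
class ToolMessage:
    content: Any
    tool_call_id: str = ""

_ROLES = {HumanMessage: "user", AIMessage: "assistant",
          SystemMessage: "system", ToolMessage: "tool"}

def convert(messages: list) -> list[dict[str, str]]:
    """Mirror the role mapping and content flattening shown in the listing."""
    out = []
    for message in messages:
        try:
            role = _ROLES[type(message)]
        except KeyError:
            raise TypeError("Received unsupported message type for Ollama.")
        content = ""
        if isinstance(message.content, str):
            content = message.content
        else:  # list of parts: bare strings or {"type": "text", ...} dicts
            for part in message.content:
                if isinstance(part, str):
                    content += f"\n{part}"
                elif part.get("type") == "text":
                    content += f"\n{part['text']}"
        out.append({"role": role, "content": content})
    return out

msgs = [SystemMessage("Be terse."),
        HumanMessage([{"type": "text", "text": "Hi"}])]
print(convert(msgs))
# → [{'role': 'system', 'content': 'Be terse.'}, {'role': 'user', 'content': '\nHi'}]
```

Note how list-valued content is flattened into a single string with newline separators, exactly as in the real function; the real implementation additionally collects images and tool calls into separate fields.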
Frequently Asked Questions
What does _convert_messages_to_ollama_messages() do?
_convert_messages_to_ollama_messages() converts a list of LangChain BaseMessage objects (HumanMessage, AIMessage, SystemMessage, ChatMessage, ToolMessage) into the message format the Ollama API consumes: it maps each message type to an Ollama role, flattens string and list content into a single text field, collects image data, and translates tool calls into OpenAI-style dicts. It is defined on ChatOllama in libs/partners/ollama/langchain_ollama/chat_models.py.
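As the listing shows, image content may arrive either as a string image_url or as a dict with a "url" key, and base64 data URLs must be reduced to their raw payload before they can go in an Ollama message's images list. A minimal sketch of that split follows; the helper name and exact validation are assumptions for illustration, not the library's API.

```python
def image_from_data_url(url: str) -> str:
    """Extract the base64 payload from a data URL (hypothetical helper).

    Mirrors the kind of handling the function performs for
    content parts of type "image_url".
    """
    prefix, sep, data = url.partition(";base64,")
    if not sep or not prefix.startswith("data:image/"):
        raise ValueError("Only base64-encoded image data URLs are supported.")
    return data

print(image_from_data_url("data:image/png;base64,iVBORw0KGgo="))
# → iVBORw0KGgo=
```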
Where is _convert_messages_to_ollama_messages() defined?
_convert_messages_to_ollama_messages() is defined in libs/partners/ollama/langchain_ollama/chat_models.py at line 812.
What does _convert_messages_to_ollama_messages() call?
_convert_messages_to_ollama_messages() calls 2 function(s): _get_image_from_data_content_block, _lc_tool_call_to_openai_tool_call.
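_lc_tool_call_to_openai_tool_call reshapes a LangChain ToolCall into the OpenAI-style dict that Ollama accepts. A hedged sketch of that shape, based on the OpenAI tool-call format rather than the library's exact source:

```python
from typing import Any

def lc_tool_call_to_openai_tool_call(tool_call: dict[str, Any]) -> dict[str, Any]:
    """Sketch: a LangChain ToolCall ({"name", "args", "id"}) becomes an
    OpenAI-style tool-call dict. Exact keys are an assumption."""
    return {
        "type": "function",
        "id": tool_call["id"],
        "function": {
            "name": tool_call["name"],
            "arguments": tool_call["args"],
        },
    }

call = {"name": "get_weather", "args": {"city": "Paris"}, "id": "call_1"}
result = lc_tool_call_to_openai_tool_call(call)
print(result["function"]["name"])
# → get_weather
```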
What calls _convert_messages_to_ollama_messages()?
_convert_messages_to_ollama_messages() is called by 1 function(s): _chat_params.