
chat_models.py — langchain Source File

Architecture documentation for chat_models.py, a Python file in the langchain codebase. 28 imports, 0 dependents.

File · Python · CoreAbstractions · MessageSchema · 28 imports · 7 functions · 1 class

Entity Profile

Dependency Diagram

graph LR
  c9284a6d_f83c_0698_fc0a_b5e98d911196["chat_models.py"]
  d68b71f1_2253_da0f_5a2f_31c9bb5e42f0["ast"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> d68b71f1_2253_da0f_5a2f_31c9bb5e42f0
  7025b240_fdc3_cf68_b72f_f41dac94566b["json"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 7025b240_fdc3_cf68_b72f_f41dac94566b
  2a7f66a7_8738_3d47_375b_70fcaa6ac169["logging"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 2a7f66a7_8738_3d47_375b_70fcaa6ac169
  cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7["collections.abc"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7
  7aaf52d4_ee88_411e_980e_bc4beeeb30ad["operator"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 7aaf52d4_ee88_411e_980e_bc4beeeb30ad
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  8dfa0cac_d802_3ccd_f710_43a5e70da3a5["uuid"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 8dfa0cac_d802_3ccd_f710_43a5e70da3a5
  f3bc7443_c889_119d_0744_aacc3620d8d2["langchain_core.callbacks"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> f3bc7443_c889_119d_0744_aacc3620d8d2
  e8ec017e_6c91_4b34_675f_2a96c5aa9be6["langchain_core.callbacks.manager"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> e8ec017e_6c91_4b34_675f_2a96c5aa9be6
  75137834_4ba7_dc43_7ec5_182c05eceedf["langchain_core.exceptions"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 75137834_4ba7_dc43_7ec5_182c05eceedf
  ba43b74d_3099_7e1c_aac3_cf594720469e["langchain_core.language_models"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> ba43b74d_3099_7e1c_aac3_cf594720469e
  2312f229_c199_ac88_c29f_62e2a2958404["langchain_core.language_models.chat_models"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 2312f229_c199_ac88_c29f_62e2a2958404
  d758344f_537f_649e_f467_b9d7442e86df["langchain_core.messages"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> d758344f_537f_649e_f467_b9d7442e86df
  4eb42b7a_5c64_04cb_fcec_1401d5c10628["langchain_core.messages.ai"]
  c9284a6d_f83c_0698_fc0a_b5e98d911196 --> 4eb42b7a_5c64_04cb_fcec_1401d5c10628
  style c9284a6d_f83c_0698_fc0a_b5e98d911196 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Ollama chat models.

**Input Flow (LangChain -> Ollama)**

`_convert_messages_to_ollama_messages()`:

- Transforms LangChain messages to `ollama.Message` format
- Extracts text content, images (base64), and tool calls

`_chat_params()`:

- Combines messages with model parameters (temperature, top_p, etc.)
- Attaches tools if provided
- Configures reasoning/thinking mode via `think` parameter
- Sets output format (raw, JSON, or JSON schema)

**Output Flow (Ollama -> LangChain)**

1. **Ollama Response**

A stream of dictionary chunks, each containing:
- `message`: Dict with `role`, `content`, `tool_calls`, `thinking`
- `done`: Boolean indicating completion
- `done_reason`: Reason for completion (`stop`, `length`, `load`)
- Token counts/timing metadata

2. **Response Processing** (`_iterate_over_stream()`)

- Extracts content from `message.content`
- Parses tool calls into `ToolCall`s
- Separates reasoning content when `reasoning=True` (stored in `additional_kwargs`)
- Builds usage metadata from token counts

3. **LangChain Output** (`ChatGenerationChunk` -> `AIMessage`)

- **Streaming**: Yields `ChatGenerationChunk` with `AIMessageChunk` content
- **Non-streaming**: Returns `ChatResult` with complete `AIMessage`
- Tool calls attached to `AIMessage.tool_calls`
- Reasoning content in `AIMessage.additional_kwargs['reasoning_content']`
"""

from __future__ import annotations

import ast
import json
import logging
from collections.abc import AsyncIterator, Callable, Iterator, Mapping, Sequence
from operator import itemgetter
from typing import Any, Literal, cast
from uuid import uuid4

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.callbacks.manager import AsyncCallbackManagerForLLMRun
from langchain_core.exceptions import OutputParserException
from langchain_core.language_models import LanguageModelInput
from langchain_core.language_models.chat_models import BaseChatModel, LangSmithParams
from langchain_core.messages import (
    AIMessage,
    AIMessageChunk,
    BaseMessage,
# ... (1547 more lines)
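The output flow described in the module docstring can be sketched in miniature. The function below is a simplified stand-in for `_iterate_over_stream`, not the real implementation; the chunk field names (`message`, `done`, `done_reason`, `prompt_eval_count`, `eval_count`) follow the Ollama API shape described above.

```python
def iterate_over_stream(stream):
    """Turn Ollama-style stream dicts into content/usage pieces."""
    for part in stream:
        msg = part.get("message", {})
        yield {
            "content": msg.get("content", ""),
            # Reasoning text is carried separately from normal content
            "reasoning": msg.get("thinking"),
            "done": part.get("done", False),
        }
        if part.get("done"):
            # Build usage metadata from token counts on the final chunk
            usage = {
                "input_tokens": part.get("prompt_eval_count", 0),
                "output_tokens": part.get("eval_count", 0),
            }
            usage["total_tokens"] = usage["input_tokens"] + usage["output_tokens"]
            yield {"usage": usage}


# A stream shaped like the chunks described in the docstring:
stream = [
    {"message": {"role": "assistant", "content": "Hel"}, "done": False},
    {"message": {"role": "assistant", "content": "lo"}, "done": True,
     "done_reason": "stop", "prompt_eval_count": 5, "eval_count": 2},
]
pieces = list(iterate_over_stream(stream))
# pieces[0]["content"] == "Hel"; the final piece carries total_tokens == 7
```

In the real module, each content piece becomes a `ChatGenerationChunk` wrapping an `AIMessageChunk`, and the usage metadata is attached to the final message.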

Dependencies

  • ast
  • collections.abc
  • json
  • langchain_core.callbacks
  • langchain_core.callbacks.manager
  • langchain_core.exceptions
  • langchain_core.language_models
  • langchain_core.language_models.chat_models
  • langchain_core.messages
  • langchain_core.messages.ai
  • langchain_core.messages.tool
  • langchain_core.output_parsers
  • langchain_core.outputs
  • langchain_core.runnables
  • langchain_core.tools
  • langchain_core.utils.function_calling
  • langchain_core.utils.pydantic
  • langchain_ollama._compat
  • langchain_ollama._utils
  • logging
  • ollama
  • operator
  • pydantic
  • pydantic.json_schema
  • pydantic.v1
  • typing
  • typing_extensions
  • uuid

Frequently Asked Questions

What does chat_models.py do?
chat_models.py implements the Ollama chat model integration for LangChain: it converts LangChain messages to Ollama's format, streams Ollama responses back as LangChain chunks, and parses tool calls and reasoning content. It belongs to the CoreAbstractions domain, MessageSchema subdomain.
What functions are defined in chat_models.py?
chat_models.py defines 7 functions: _get_image_from_data_content_block, _get_tool_calls_from_response, _get_usage_metadata_from_generation_info, _is_pydantic_class, _lc_tool_call_to_openai_tool_call, _parse_arguments_from_tool_call, _parse_json_string.
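The helper names above, together with the module's `json` and `ast` imports, suggest how malformed tool-call arguments are handled: models sometimes emit Python-literal dicts (single quotes) instead of strict JSON. A hedged sketch of that fallback pattern (the real `_parse_json_string` signature and error handling may differ):

```python
import ast
import json


def parse_json_string(raw: str):
    """Parse a JSON string, falling back to Python-literal syntax.

    Simplified illustration of a json -> ast.literal_eval fallback;
    not the actual _parse_json_string implementation.
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Handles single-quoted, Python-style dicts like "{'a': 1}"
        return ast.literal_eval(raw)


parse_json_string('{"city": "Paris"}')   # strict JSON
parse_json_string("{'city': 'Paris'}")   # Python-literal fallback
```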
What does chat_models.py depend on?
chat_models.py imports 28 modules: ast, collections.abc, json, langchain_core.callbacks, langchain_core.callbacks.manager, langchain_core.exceptions, langchain_core.language_models, langchain_core.language_models.chat_models, and 20 more.
Where is chat_models.py in the architecture?
chat_models.py is located at libs/partners/ollama/langchain_ollama/chat_models.py (domain: CoreAbstractions, subdomain: MessageSchema, directory: libs/partners/ollama/langchain_ollama).
