ChatMistralAI Class — langchain Architecture

Architecture documentation for the ChatMistralAI class in chat_models.py from the langchain codebase.
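
For orientation, here is a minimal usage sketch. It assumes the langchain-mistralai package is installed and a MISTRAL_API_KEY is set in the environment; the prompt and parameter values are illustrative.

from langchain_mistralai import ChatMistralAI

# The field is named "model" (default "mistral-small"); "model_name" is accepted as an alias.
llm = ChatMistralAI(model="mistral-small", temperature=0.7)

# invoke() comes from the inherited BaseChatModel / Runnable interface and returns an AIMessage.
response = llm.invoke("Explain what a chat model wrapper does in one sentence.")
print(response.content)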

Entity Profile

Dependency Diagram

graph TD
  ChatMistralAI["ChatMistralAI"]
  BaseChatModel["BaseChatModel"]
  ChatMistralAI -->|extends| BaseChatModel
  AIMessage["AIMessage"]
  ChatMistralAI -->|returns| AIMessage
  chat_models_py["chat_models.py"]
  ChatMistralAI -->|defined in| chat_models_py
  build_extra["build_extra()"]
  ChatMistralAI -->|method| build_extra
  default_params["_default_params()"]
  ChatMistralAI -->|method| default_params
  get_ls_params["_get_ls_params()"]
  ChatMistralAI -->|method| get_ls_params
  client_params["_client_params()"]
  ChatMistralAI -->|method| client_params
  completion_with_retry["completion_with_retry()"]
  ChatMistralAI -->|method| completion_with_retry
  combine_llm_outputs["_combine_llm_outputs()"]
  ChatMistralAI -->|method| combine_llm_outputs
  validate_environment["validate_environment()"]
  ChatMistralAI -->|method| validate_environment
  set_model_profile["_set_model_profile()"]
  ChatMistralAI -->|method| set_model_profile
  generate["_generate()"]
  ChatMistralAI -->|method| generate
  create_chat_result["_create_chat_result()"]
  ChatMistralAI -->|method| create_chat_result
  create_message_dicts["_create_message_dicts()"]
  ChatMistralAI -->|method| create_message_dicts
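
The edges above can be checked at runtime. A small sketch, assuming langchain-core and langchain-mistralai are installed and an API key is configured; the prompt is illustrative.

from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.messages import AIMessage
from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI()

# "extends": ChatMistralAI is a subclass of BaseChatModel.
assert isinstance(llm, BaseChatModel)

# "returns": responses from the model come back as AIMessage instances.
result = llm.invoke("ping")
assert isinstance(result, AIMessage)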

Source Code

libs/partners/mistralai/langchain_mistralai/chat_models.py, lines 460–1185 (only the opening portion of the class is reproduced below)

class ChatMistralAI(BaseChatModel):
    """A chat model that uses the Mistral AI API."""

    # The type for client and async_client is ignored because the type is not
    # an Optional after the model is initialized and the model_validator
    # is run.
    client: httpx.Client = Field(  # type: ignore[assignment] # : meta private:
        default=None, exclude=True
    )

    async_client: httpx.AsyncClient = Field(  # type: ignore[assignment] # : meta private:
        default=None, exclude=True
    )

    mistral_api_key: SecretStr | None = Field(
        alias="api_key",
        default_factory=secret_from_env("MISTRAL_API_KEY", default=None),
    )

    endpoint: str | None = Field(default=None, alias="base_url")

    max_retries: int = 5

    timeout: int = 120

    max_concurrent_requests: int = 64

    model: str = Field(default="mistral-small", alias="model_name")

    temperature: float = 0.7

    max_tokens: int | None = None

    top_p: float = 1
    """Decode using nucleus sampling: consider the smallest set of tokens whose
    probability sum is at least `top_p`. Must be in the closed interval
    `[0.0, 1.0]`."""

    random_seed: int | None = None

    safe_mode: bool | None = None

    streaming: bool = False

    model_kwargs: dict[str, Any] = Field(default_factory=dict)
    """Holds any invocation parameters not explicitly specified."""

    model_config = ConfigDict(
        populate_by_name=True,
        arbitrary_types_allowed=True,
    )

    @model_validator(mode="before")
    @classmethod
    def build_extra(cls, values: dict[str, Any]) -> Any:
        """Build extra kwargs from additional params that were passed in."""
        all_required_field_names = get_pydantic_field_names(cls)
        return _build_model_kwargs(values, all_required_field_names)

    @property
    def _default_params(self) -> dict[str, Any]:
        """Get the default parameters for calling the API."""
        defaults = {
            "model": self.model,
            "temperature": self.temperature,
            "max_tokens": self.max_tokens,
            "top_p": self.top_p,
            "random_seed": self.random_seed,
            "safe_prompt": self.safe_mode,
            **self.model_kwargs,
        }
        return {k: v for k, v in defaults.items() if v is not None}

    def _get_ls_params(
        self, stop: list[str] | None = None, **kwargs: Any
    ) -> LangSmithParams:
        """Get standard params for tracing."""
        params = self._get_invocation_params(stop=stop, **kwargs)
        ls_params = LangSmithParams(
            ls_provider="mistral",
            ls_model_name=params.get("model", self.model),
            # ... excerpt truncated; the remainder of _get_ls_params and the other
            # methods listed in the diagram continue through line 1185 of chat_models.py.
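
To connect the fields above to the request payload, here is a short sketch of how the _default_params property behaves per the code shown. It assumes a MISTRAL_API_KEY is available; the values are illustrative and no network call is made.

from langchain_mistralai import ChatMistralAI

llm = ChatMistralAI(model="mistral-small", temperature=0.2, max_tokens=256)

# _default_params filters out None-valued entries, so unset options such as
# random_seed and safe_prompt are omitted, while top_p keeps its default of 1.
print(llm._default_params)
# Expected, per the property above:
# {'model': 'mistral-small', 'temperature': 0.2, 'max_tokens': 256, 'top_p': 1}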

Frequently Asked Questions

What is the ChatMistralAI class?
ChatMistralAI is a chat model class in the langchain codebase that calls the Mistral AI API; it is defined in libs/partners/mistralai/langchain_mistralai/chat_models.py.
Where is ChatMistralAI defined?
ChatMistralAI is defined in libs/partners/mistralai/langchain_mistralai/chat_models.py at line 460.
What does ChatMistralAI extend?
ChatMistralAI extends BaseChatModel; its responses are returned as AIMessage objects.
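How do I configure the API key and endpoint?
The mistral_api_key field (alias api_key) is read from the MISTRAL_API_KEY environment variable by default, and endpoint (alias base_url) overrides the API base URL. A minimal configuration sketch, with placeholder key and URL values:

import os

from langchain_mistralai import ChatMistralAI

# Option 1: rely on the environment variable picked up by secret_from_env.
os.environ["MISTRAL_API_KEY"] = "<your-key>"
llm_from_env = ChatMistralAI(model="mistral-small")

# Option 2: pass the values explicitly via the field aliases
# (api_key -> mistral_api_key, base_url -> endpoint).
llm_explicit = ChatMistralAI(
    model="mistral-small",
    api_key="<your-key>",
    base_url="https://api.mistral.ai/v1",  # placeholder; omit to use the default endpoint
)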
