test_chat_models.py — langchain Source File
Architecture documentation for test_chat_models.py, a Python file in the langchain codebase. 9 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
    9003580b_9bcf_c959_a278_ba1963301452["test_chat_models.py"]
    e27da29f_a1f7_49f3_84d5_6be4cb4125c8["logging"]
    9003580b_9bcf_c959_a278_ba1963301452 --> e27da29f_a1f7_49f3_84d5_6be4cb4125c8
    996b2db9_46dd_901f_f7eb_068bafab4b12["time"]
    9003580b_9bcf_c959_a278_ba1963301452 --> 996b2db9_46dd_901f_f7eb_068bafab4b12
    feec1ec4_6917_867b_d228_b134d0ff8099["typing"]
    9003580b_9bcf_c959_a278_ba1963301452 --> feec1ec4_6917_867b_d228_b134d0ff8099
    f69d6389_263d_68a4_7fbf_f14c0602a9ba["pytest"]
    9003580b_9bcf_c959_a278_ba1963301452 --> f69d6389_263d_68a4_7fbf_f14c0602a9ba
    d2b62b81_6a74_9153_fcd4_ff7470a9b3d2["httpx"]
    9003580b_9bcf_c959_a278_ba1963301452 --> d2b62b81_6a74_9153_fcd4_ff7470a9b3d2
    9444498b_8066_55c7_b3a2_1d90c4162a32["langchain_core.messages"]
    9003580b_9bcf_c959_a278_ba1963301452 --> 9444498b_8066_55c7_b3a2_1d90c4162a32
    dd5e7909_a646_84f1_497b_cae69735550e["pydantic"]
    9003580b_9bcf_c959_a278_ba1963301452 --> dd5e7909_a646_84f1_497b_cae69735550e
    f85fae70_1011_eaec_151c_4083140ae9e5["typing_extensions"]
    9003580b_9bcf_c959_a278_ba1963301452 --> f85fae70_1011_eaec_151c_4083140ae9e5
    e6fa1d7a_9364_a001_abb8_3912b7d2ed4a["langchain_mistralai.chat_models"]
    9003580b_9bcf_c959_a278_ba1963301452 --> e6fa1d7a_9364_a001_abb8_3912b7d2ed4a
    style 9003580b_9bcf_c959_a278_ba1963301452 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
"""Test ChatMistral chat model."""
from __future__ import annotations
import logging
import time
from typing import Any
import pytest
from httpx import ReadTimeout
from langchain_core.messages import AIMessageChunk, BaseMessageChunk
from pydantic import BaseModel
from typing_extensions import TypedDict
from langchain_mistralai.chat_models import ChatMistralAI
async def test_astream() -> None:
"""Test streaming tokens from ChatMistralAI."""
llm = ChatMistralAI()
full: BaseMessageChunk | None = None
chunks_with_token_counts = 0
chunks_with_response_metadata = 0
async for token in llm.astream("Hello"):
assert isinstance(token, AIMessageChunk)
assert isinstance(token.content, str)
full = token if full is None else full + token
if token.usage_metadata is not None:
chunks_with_token_counts += 1
if token.response_metadata and not set(token.response_metadata.keys()).issubset(
{"model_provider", "output_version"}
):
chunks_with_response_metadata += 1
if chunks_with_token_counts != 1 or chunks_with_response_metadata != 1:
msg = (
"Expected exactly one chunk with token counts or response_metadata. "
"AIMessageChunk aggregation adds / appends counts and metadata. Check that "
"this is behaving properly."
)
raise AssertionError(msg)
assert isinstance(full, AIMessageChunk)
assert full.usage_metadata is not None
assert full.usage_metadata["input_tokens"] > 0
assert full.usage_metadata["output_tokens"] > 0
assert (
full.usage_metadata["input_tokens"] + full.usage_metadata["output_tokens"]
== full.usage_metadata["total_tokens"]
)
assert isinstance(full.response_metadata["model_name"], str)
assert full.response_metadata["model_name"]
class Book(BaseModel):
name: str
authors: list[str]
class BookDict(TypedDict):
name: str
// ... (136 more lines)
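The core invariant the streaming test checks is that `AIMessageChunk` objects accumulate via `+`: text content concatenates, and usage metadata should appear on exactly one chunk so that token counts are not double-counted when chunks are summed. The sketch below illustrates that aggregation pattern with a plain stand-in class rather than the real `AIMessageChunk` (whose merge logic also handles tool calls and metadata appending), so it can run without a Mistral API key:

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Chunk:
    """Minimal stand-in for AIMessageChunk: text content plus optional usage metadata."""

    content: str
    usage_metadata: dict[str, int] | None = None

    def __add__(self, other: "Chunk") -> "Chunk":
        # Concatenate text; merge token counts key-by-key when both sides carry them.
        if self.usage_metadata is None:
            merged = other.usage_metadata
        elif other.usage_metadata is None:
            merged = self.usage_metadata
        else:
            merged = {
                k: self.usage_metadata.get(k, 0) + other.usage_metadata.get(k, 0)
                for k in set(self.usage_metadata) | set(other.usage_metadata)
            }
        return Chunk(self.content + other.content, merged)


# Simulated stream: only the final chunk carries usage metadata,
# mirroring what the test expects the provider to emit.
stream = [
    Chunk("Hel"),
    Chunk("lo"),
    Chunk("", {"input_tokens": 1, "output_tokens": 2, "total_tokens": 3}),
]

full = None
chunks_with_token_counts = 0
for token in stream:
    full = token if full is None else full + token
    if token.usage_metadata is not None:
        chunks_with_token_counts += 1

assert chunks_with_token_counts == 1
assert full.content == "Hello"
assert (
    full.usage_metadata["input_tokens"] + full.usage_metadata["output_tokens"]
    == full.usage_metadata["total_tokens"]
)
```

If the provider sent counts on every chunk, the key-by-key merge would inflate the totals; that is exactly the regression the `chunks_with_token_counts != 1` guard in the test is designed to catch.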
Dependencies
- httpx
- langchain_core.messages
- langchain_mistralai.chat_models
- logging
- pydantic
- pytest
- time
- typing
- typing_extensions
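Among these dependencies, httpx supplies the `ReadTimeout` exception that the file's `test_retry_parameters` exercises, and `time` supports backoff between attempts. The real client's retry behavior is internal to `ChatMistralAI` and is not shown here; as a hedged illustration, the generic retry loop below captures the behavior such a test exercises, using a hypothetical `TransientError` in place of an actual network timeout:

```python
import time


class TransientError(Exception):
    """Stand-in for a transient network failure such as httpx.ReadTimeout."""


def call_with_retries(fn, max_retries: int = 2, backoff: float = 0.0):
    """Re-invoke fn on transient errors, allowing up to max_retries extra attempts.

    A sketch of retry-parameter semantics; the real client's internals may differ.
    """
    attempts = 0
    while True:
        try:
            return fn()
        except TransientError:
            attempts += 1
            if attempts > max_retries:
                raise  # retries exhausted: surface the original error
            time.sleep(backoff)


calls = {"n": 0}


def flaky():
    # Fails twice, then succeeds: within the budget of max_retries=2.
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError
    return "ok"


result = call_with_retries(flaky, max_retries=2)
assert result == "ok"
assert calls["n"] == 3
```

With `max_retries=1` the same `flaky` function would exhaust its budget and re-raise, which is the distinction a retry-parameter test typically asserts on.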
Frequently Asked Questions
What does test_chat_models.py do?
test_chat_models.py is a source file in the langchain codebase, written in Python. It belongs to the LangChainCore domain, LanguageModelBase subdomain.
What functions are defined in test_chat_models.py?
test_chat_models.py defines 7 function(s): _check_parsed_result, test_astream, test_reasoning, test_reasoning_v1, test_retry_parameters, test_structured_output_json_schema, test_structured_output_json_schema_async.
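The structured-output tests (`test_structured_output_json_schema` and its async variant) funnel model output through `_check_parsed_result`, validating parsed JSON against the `Book` schema defined in the source (`name: str`, `authors: list[str]`). The actual helper's signature is not shown in the excerpt above; the sketch below is a hypothetical stdlib-only validator in the same spirit, with an illustrative book title that does not come from the test file:

```python
import json


def check_parsed_book(result: dict) -> None:
    """Hypothetical validator in the spirit of _check_parsed_result:
    verify that parsed JSON matches the Book schema (name: str, authors: list[str])."""
    assert isinstance(result.get("name"), str)
    assert isinstance(result.get("authors"), list)
    assert all(isinstance(author, str) for author in result["authors"])


# A model emitting structured output would return JSON like this.
raw = '{"name": "The Pragmatic Programmer", "authors": ["Hunt", "Thomas"]}'
parsed = json.loads(raw)
check_parsed_book(parsed)  # raises AssertionError if the shape is wrong
```

In the real tests, pydantic's `Book` model (or the `BookDict` TypedDict) performs this validation with full type coercion and error reporting rather than bare assertions.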
What does test_chat_models.py depend on?
test_chat_models.py imports 9 module(s): httpx, langchain_core.messages, langchain_mistralai.chat_models, logging, pydantic, pytest, time, typing, and typing_extensions.
Where is test_chat_models.py in the architecture?
test_chat_models.py is located at libs/partners/mistralai/tests/integration_tests/test_chat_models.py (domain: LangChainCore, subdomain: LanguageModelBase, directory: libs/partners/mistralai/tests/integration_tests).