llms.py — langchain Source File

Architecture documentation for llms.py, a Python file in the langchain codebase: 14 imports, 0 dependents.

Entity Profile

Dependency Diagram

graph LR
  60e64790_a72f_5559_ad87_ee31cb9bb0a9["llms.py"]
  67ec3255_645e_8b6e_1eff_1eb3c648ed95["re"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 67ec3255_645e_8b6e_1eff_1eb3c648ed95
  0c635125_6987_b8b3_7ff7_d60249aecde7["warnings"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 0c635125_6987_b8b3_7ff7_d60249aecde7
  cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7["collections.abc"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  ad604472_c022_b119_aeba_5ff8893361cd["anthropic"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> ad604472_c022_b119_aeba_5ff8893361cd
  f3bc7443_c889_119d_0744_aacc3620d8d2["langchain_core.callbacks"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> f3bc7443_c889_119d_0744_aacc3620d8d2
  ba43b74d_3099_7e1c_aac3_cf594720469e["langchain_core.language_models"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> ba43b74d_3099_7e1c_aac3_cf594720469e
  89934eed_a823_2184_acf2_039f48eed5f9["langchain_core.language_models.llms"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 89934eed_a823_2184_acf2_039f48eed5f9
  ac2a9b92_4484_491e_1b48_ec85e71e1d58["langchain_core.outputs"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> ac2a9b92_4484_491e_1b48_ec85e71e1d58
  5b417886_56dd_6afa_13ab_a3cfc1dbcccd["langchain_core.prompt_values"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 5b417886_56dd_6afa_13ab_a3cfc1dbcccd
  f4d905c6_a2b2_eb8f_be9b_7808b72f6a16["langchain_core.utils"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> f4d905c6_a2b2_eb8f_be9b_7808b72f6a16
  afb01135_7b40_b00b_9769_21eebcc09aa3["langchain_core.utils.utils"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> afb01135_7b40_b00b_9769_21eebcc09aa3
  6e58aaea_f08e_c099_3cc7_f9567bfb1ae7["pydantic"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 6e58aaea_f08e_c099_3cc7_f9567bfb1ae7
  91721f45_4909_e489_8c1f_084f8bd87145["typing_extensions"]
  60e64790_a72f_5559_ad87_ee31cb9bb0a9 --> 91721f45_4909_e489_8c1f_084f8bd87145
  style 60e64790_a72f_5559_ad87_ee31cb9bb0a9 fill:#6366f1,stroke:#818cf8,color:#fff
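The node identifiers in the diagram above look like UUID-style hashes with the dashes replaced by underscores (Mermaid node IDs cannot contain dashes). How they are actually derived is not shown on this page; as a purely hypothetical sketch, a stable Mermaid-safe ID could be produced from a module name like this:

```python
import uuid

def node_id(name: str) -> str:
    # Hash the name into a stable UUID, then make it Mermaid-safe:
    # Mermaid node IDs may not contain dashes, so swap them for underscores.
    # NAMESPACE_URL is an arbitrary choice; the real generator's scheme is unknown.
    return str(uuid.uuid5(uuid.NAMESPACE_URL, name)).replace("-", "_")

print(node_id("llms.py"))
```

Because `uuid5` is a deterministic hash, the same module name always maps to the same node, which keeps the diagram stable across regenerations.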

Source Code

"""Anthropic LLM wrapper. Chat models are in `chat_models.py`."""

from __future__ import annotations

import re
import warnings
from collections.abc import AsyncIterator, Callable, Iterator, Mapping
from typing import Any

import anthropic
from langchain_core.callbacks import (
    AsyncCallbackManagerForLLMRun,
    CallbackManagerForLLMRun,
)
from langchain_core.language_models import BaseLanguageModel, LangSmithParams
from langchain_core.language_models.llms import LLM
from langchain_core.outputs import GenerationChunk
from langchain_core.prompt_values import PromptValue
from langchain_core.utils import get_pydantic_field_names
from langchain_core.utils.utils import _build_model_kwargs, from_env, secret_from_env
from pydantic import ConfigDict, Field, SecretStr, model_validator
from typing_extensions import Self


class _AnthropicCommon(BaseLanguageModel):
    client: Any = None

    async_client: Any = None

    model: str = Field(default="claude-sonnet-4-5", alias="model_name")
    """Model name to use."""

    max_tokens: int = Field(default=1024, alias="max_tokens_to_sample")
    """Denotes the number of tokens to predict per generation."""

    temperature: float | None = None
    """A non-negative float that tunes the degree of randomness in generation."""

    top_k: int | None = None
    """Number of most likely tokens to consider at each step."""

    top_p: float | None = None
    """Total probability mass of tokens to consider at each step."""

    streaming: bool = False
    """Whether to stream the results."""

    default_request_timeout: float | None = None
    """Timeout for requests to Anthropic Completion API. Default is 600 seconds."""

    max_retries: int = 2
    """Number of retries allowed for requests sent to the Anthropic Completion API."""

    anthropic_api_url: str | None = Field(
        alias="base_url",
        default_factory=from_env(
            "ANTHROPIC_API_URL",
            default="https://api.anthropic.com",
        ),
    )
# ... (374 more lines)

Dependencies

  • anthropic
  • collections.abc
  • langchain_core.callbacks
  • langchain_core.language_models
  • langchain_core.language_models.llms
  • langchain_core.outputs
  • langchain_core.prompt_values
  • langchain_core.utils
  • langchain_core.utils.utils
  • pydantic
  • re
  • typing
  • typing_extensions
  • warnings
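The import list above can be reproduced mechanically. As a small illustration (not the tool that generated this page), the standard-library `ast` module can extract the imported module names from Python source:

```python
import ast

SAMPLE = """\
import re
import warnings
from collections.abc import AsyncIterator, Callable
from typing import Any
"""

def imported_modules(source: str) -> set[str]:
    # Walk the syntax tree and record the module named by each
    # `import X` and `from X import ...` statement.
    mods: set[str] = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            mods.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            mods.add(node.module)
    return mods

print(sorted(imported_modules(SAMPLE)))
```

Run against the full source of llms.py, this yields the 14 modules listed above.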

Frequently Asked Questions

What does llms.py do?
llms.py is the Anthropic LLM (completion) wrapper in the langchain codebase; chat models live in `chat_models.py`. It is a Python source file in the CoreAbstractions domain, RunnableInterface subdomain.
What does llms.py depend on?
llms.py imports 14 modules: anthropic, collections.abc, langchain_core.callbacks, langchain_core.language_models, langchain_core.language_models.llms, langchain_core.outputs, langchain_core.prompt_values, langchain_core.utils, and 6 more.
Where is llms.py in the architecture?
llms.py is located at libs/partners/anthropic/langchain_anthropic/llms.py (domain: CoreAbstractions, subdomain: RunnableInterface, directory: libs/partners/anthropic/langchain_anthropic).