base.py — langchain Source File

Architecture documentation for base.py, a Python file in the langchain codebase. 13 imports, 0 dependents.

File · Python · LangChainCore · LanguageModelBase · 13 imports · 2 functions · 2 classes

Entity Profile

Dependency Diagram

```mermaid
graph LR
  base_py["base.py"]
  logging["logging"]
  base_py --> logging
  sys["sys"]
  base_py --> sys
  collections_abc["collections.abc"]
  base_py --> collections_abc
  typing["typing"]
  base_py --> typing
  openai["openai"]
  base_py --> openai
  tiktoken["tiktoken"]
  base_py --> tiktoken
  lc_callbacks["langchain_core.callbacks"]
  base_py --> lc_callbacks
  lc_llms["langchain_core.language_models.llms"]
  base_py --> lc_llms
  lc_outputs["langchain_core.outputs"]
  base_py --> lc_outputs
  lc_utils["langchain_core.utils"]
  base_py --> lc_utils
  lc_utils_utils["langchain_core.utils.utils"]
  base_py --> lc_utils_utils
  pydantic["pydantic"]
  base_py --> pydantic
  typing_ext["typing_extensions"]
  base_py --> typing_ext
  style base_py fill:#6366f1,stroke:#818cf8,color:#fff
```

Source Code

"""Base classes for OpenAI large language models. Chat models are in `chat_models/`."""

from __future__ import annotations

import logging
import sys
from collections.abc import AsyncIterator, Callable, Collection, Iterator, Mapping
from typing import Any, Literal

import openai
import tiktoken
from langchain_core.callbacks import (
    AsyncCallbackManagerForLLMRun,
    CallbackManagerForLLMRun,
)
from langchain_core.language_models.llms import BaseLLM
from langchain_core.outputs import Generation, GenerationChunk, LLMResult
from langchain_core.utils import get_pydantic_field_names
from langchain_core.utils.utils import _build_model_kwargs, from_env, secret_from_env
from pydantic import ConfigDict, Field, SecretStr, model_validator
from typing_extensions import Self

logger = logging.getLogger(__name__)


def _update_token_usage(
    keys: set[str], response: dict[str, Any], token_usage: dict[str, Any]
) -> None:
    """Update token usage."""
    _keys_to_use = keys.intersection(response["usage"])
    for _key in _keys_to_use:
        if _key not in token_usage:
            token_usage[_key] = response["usage"][_key]
        else:
            token_usage[_key] += response["usage"][_key]


def _stream_response_to_generation_chunk(
    stream_response: dict[str, Any],
) -> GenerationChunk:
    """Convert a stream response to a generation chunk."""
    if not stream_response["choices"]:
        return GenerationChunk(text="")
    return GenerationChunk(
        text=stream_response["choices"][0]["text"] or "",
        generation_info={
            "finish_reason": stream_response["choices"][0].get("finish_reason", None),
            "logprobs": stream_response["choices"][0].get("logprobs", None),
        },
    )


class BaseOpenAI(BaseLLM):
    """Base OpenAI large language model class.

    Setup:
        Install `langchain-openai` and set environment variable `OPENAI_API_KEY`.

        ```bash
        pip install -U langchain-openai
# ... (813 more lines)
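The `_update_token_usage` helper shown in the source above merges the `usage` dict from each API response into a running total across a batch of calls. A minimal standalone sketch of that accumulation logic (plain dicts stand in for real OpenAI response payloads; the mock usage numbers are illustrative):

```python
from typing import Any


def update_token_usage(
    keys: set[str], response: dict[str, Any], token_usage: dict[str, Any]
) -> None:
    """Accumulate the usage counters named in `keys` into `token_usage`."""
    # Only counters present in both `keys` and the response's usage dict are merged.
    for key in keys.intersection(response["usage"]):
        token_usage[key] = token_usage.get(key, 0) + response["usage"][key]


# Two mock responses, shaped like the `usage` section of an OpenAI
# completions payload.
responses = [
    {"usage": {"prompt_tokens": 12, "completion_tokens": 30, "total_tokens": 42}},
    {"usage": {"prompt_tokens": 8, "completion_tokens": 10, "total_tokens": 18}},
]

totals: dict[str, Any] = {}
for r in responses:
    update_token_usage(
        {"prompt_tokens", "completion_tokens", "total_tokens"}, r, totals
    )

print(totals["total_tokens"])  # 60
```

The intersection step means counters absent from a given response are simply skipped rather than raising a `KeyError`, which matters when different endpoints report different usage fields.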

Domain

LangChainCore

Subdomains

LanguageModelBase

Dependencies

  • collections.abc
  • langchain_core.callbacks
  • langchain_core.language_models.llms
  • langchain_core.outputs
  • langchain_core.utils
  • langchain_core.utils.utils
  • logging
  • openai
  • pydantic
  • sys
  • tiktoken
  • typing
  • typing_extensions

Frequently Asked Questions

What does base.py do?
base.py is a source file in the langchain codebase, written in Python. It belongs to the LangChainCore domain (LanguageModelBase subdomain) and provides the base classes for OpenAI large language models.
What functions are defined in base.py?
base.py defines 2 module-level functions: _stream_response_to_generation_chunk and _update_token_usage.
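Of the two, `_stream_response_to_generation_chunk` maps one raw streaming response dict onto a `GenerationChunk`, guarding against keep-alive frames that carry no choices. A standalone sketch of that mapping, with a simple dataclass standing in for langchain_core's `GenerationChunk` (the stand-in name `Chunk` is illustrative):

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class Chunk:
    """Stand-in for langchain_core's GenerationChunk."""
    text: str
    generation_info: Optional[dict[str, Any]] = None


def stream_response_to_chunk(stream_response: dict[str, Any]) -> Chunk:
    """Convert one raw streaming response dict into a chunk."""
    if not stream_response["choices"]:
        # Keep-alive or empty frames carry no choices; emit an empty chunk.
        return Chunk(text="")
    choice = stream_response["choices"][0]
    return Chunk(
        text=choice["text"] or "",
        generation_info={
            "finish_reason": choice.get("finish_reason"),
            "logprobs": choice.get("logprobs"),
        },
    )


chunk = stream_response_to_chunk(
    {"choices": [{"text": "Hello", "finish_reason": None, "logprobs": None}]}
)
print(chunk.text)  # Hello
```

Note the `or ""` on the text field: a `None` delta from the API is normalized to an empty string so downstream concatenation of chunks never fails.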
What does base.py depend on?
base.py imports 13 modules: collections.abc, langchain_core.callbacks, langchain_core.language_models.llms, langchain_core.outputs, langchain_core.utils, langchain_core.utils.utils, logging, openai, pydantic, sys, tiktoken, typing, and typing_extensions.
Where is base.py in the architecture?
base.py is located at libs/partners/openai/langchain_openai/llms/base.py (domain: LangChainCore, subdomain: LanguageModelBase, directory: libs/partners/openai/langchain_openai/llms).
