base.py — langchain Source File
Architecture documentation for base.py, a Python file in the langchain codebase. 23 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
  0da0ed82_99c0_5721_d306_207eb668ee77["base.py"]
  614e7b9f_ed51_0780_749c_ff40b74963fc["inspect"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> 614e7b9f_ed51_0780_749c_ff40b74963fc
  0c635125_6987_b8b3_7ff7_d60249aecde7["warnings"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> 0c635125_6987_b8b3_7ff7_d60249aecde7
  cccbe73e_4644_7211_4d55_e8fb133a8014["abc"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> cccbe73e_4644_7211_4d55_e8fb133a8014
  cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7["collections.abc"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7
  b6ee5de5_719a_eeb5_1e11_e9c63bc22ef8["pathlib"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> b6ee5de5_719a_eeb5_1e11_e9c63bc22ef8
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  b19a8b7e_fbee_95b1_65b8_509a1ed3cad7["langchain_core._api"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> b19a8b7e_fbee_95b1_65b8_509a1ed3cad7
  f3bc7443_c889_119d_0744_aacc3620d8d2["langchain_core.callbacks"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> f3bc7443_c889_119d_0744_aacc3620d8d2
  c554676d_b731_47b2_a98f_c1c2d537c0aa["langchain_core.documents"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> c554676d_b731_47b2_a98f_c1c2d537c0aa
  ba43b74d_3099_7e1c_aac3_cf594720469e["langchain_core.language_models"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> ba43b74d_3099_7e1c_aac3_cf594720469e
  d758344f_537f_649e_f467_b9d7442e86df["langchain_core.messages"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> d758344f_537f_649e_f467_b9d7442e86df
  e6b4f61e_7b98_6666_3641_26b069517d4a["langchain_core.prompts"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> e6b4f61e_7b98_6666_3641_26b069517d4a
  38bc5323_3713_7377_32f8_091293bea54b["langchain_core.retrievers"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> 38bc5323_3713_7377_32f8_091293bea54b
  2ceb1686_0f8c_8ae0_36d1_7c0b702fda1c["langchain_core.runnables"]
  0da0ed82_99c0_5721_d306_207eb668ee77 --> 2ceb1686_0f8c_8ae0_36d1_7c0b702fda1c
  style 0da0ed82_99c0_5721_d306_207eb668ee77 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
"""Chain for chatting with a vector database."""
from __future__ import annotations
import inspect
import warnings
from abc import abstractmethod
from collections.abc import Callable
from pathlib import Path
from typing import Any
from langchain_core._api import deprecated
from langchain_core.callbacks import (
    AsyncCallbackManagerForChainRun,
    CallbackManagerForChainRun,
    Callbacks,
)
from langchain_core.documents import Document
from langchain_core.language_models import BaseLanguageModel
from langchain_core.messages import BaseMessage
from langchain_core.prompts import BasePromptTemplate
from langchain_core.retrievers import BaseRetriever
from langchain_core.runnables import RunnableConfig
from langchain_core.vectorstores import VectorStore
from pydantic import BaseModel, ConfigDict, Field, model_validator
from typing_extensions import override
from langchain_classic.chains.base import Chain
from langchain_classic.chains.combine_documents.base import BaseCombineDocumentsChain
from langchain_classic.chains.combine_documents.stuff import StuffDocumentsChain
from langchain_classic.chains.conversational_retrieval.prompts import (
    CONDENSE_QUESTION_PROMPT,
)
from langchain_classic.chains.llm import LLMChain
from langchain_classic.chains.question_answering import load_qa_chain
# Depending on the memory type and configuration, the chat history format may differ.
# This needs to be consolidated.
CHAT_TURN_TYPE = tuple[str, str] | BaseMessage
_ROLE_MAP = {"human": "Human: ", "ai": "Assistant: "}
def _get_chat_history(chat_history: list[CHAT_TURN_TYPE]) -> str:
    buffer = ""
    for dialogue_turn in chat_history:
        if isinstance(dialogue_turn, BaseMessage):
            if len(dialogue_turn.content) > 0:
                role_prefix = _ROLE_MAP.get(
                    dialogue_turn.type,
                    f"{dialogue_turn.type}: ",
                )
                buffer += f"\n{role_prefix}{dialogue_turn.content}"
        elif isinstance(dialogue_turn, tuple):
            human = "Human: " + dialogue_turn[0]
            ai = "Assistant: " + dialogue_turn[1]
            buffer += f"\n{human}\n{ai}"
        else:
            msg = ( # type: ignore[unreachable]
# ... (519 more lines)
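The helper above flattens a mixed chat history, where each turn is either a BaseMessage or a (human, ai) string tuple, into a single prompt-ready string. A minimal usage sketch, assuming langchain_core and langchain_classic are installed (_get_chat_history is a private helper, imported here only for illustration):

from langchain_core.messages import AIMessage, HumanMessage

from langchain_classic.chains.conversational_retrieval.base import _get_chat_history

# Mixed history: a (human, ai) tuple followed by message objects.
history = [
    ("What is a vector store?", "An index of embeddings for similarity search."),
    HumanMessage(content="Which retrievers can use one?"),
    AIMessage(content="Any retriever backed by a VectorStore."),
]

print(_get_chat_history(history))
# Human: What is a vector store?
# Assistant: An index of embeddings for similarity search.
# Human: Which retrievers can use one?
# Assistant: Any retriever backed by a VectorStore.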
Domain
- CoreAbstractions
Subdomains
- RunnableInterface
Functions
- _get_chat_history
Dependencies
- abc
- collections.abc
- inspect
- langchain_classic.chains.base
- langchain_classic.chains.combine_documents.base
- langchain_classic.chains.combine_documents.stuff
- langchain_classic.chains.conversational_retrieval.prompts
- langchain_classic.chains.llm
- langchain_classic.chains.question_answering
- langchain_core._api
- langchain_core.callbacks
- langchain_core.documents
- langchain_core.language_models
- langchain_core.messages
- langchain_core.prompts
- langchain_core.retrievers
- langchain_core.runnables
- langchain_core.vectorstores
- pathlib
- pydantic
- typing
- typing_extensions
- warnings
Frequently Asked Questions
What does base.py do?
base.py is a Python source file in the langchain codebase that implements the chain for chatting with a vector database, i.e. conversational retrieval over a retriever using chat history. It belongs to the CoreAbstractions domain, RunnableInterface subdomain. A hedged usage sketch follows the FAQ below.
What functions are defined in base.py?
base.py defines one function: _get_chat_history.
What does base.py depend on?
base.py imports 23 modules: abc, collections.abc, inspect, langchain_classic.chains.base, langchain_classic.chains.combine_documents.base, langchain_classic.chains.combine_documents.stuff, langchain_classic.chains.conversational_retrieval.prompts, langchain_classic.chains.llm, and 15 more.
Where is base.py in the architecture?
base.py is located at libs/langchain/langchain_classic/chains/conversational_retrieval/base.py (domain: CoreAbstractions, subdomain: RunnableInterface, directory: libs/langchain/langchain_classic/chains/conversational_retrieval).
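As a rough illustration of how this chain is typically wired up: the sketch below assumes the file exposes ConversationalRetrievalChain, as in classic LangChain, and uses a fake LLM plus a toy retriever so the snippet stays self-contained; any real BaseLanguageModel and BaseRetriever would slot in the same way.

from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.language_models.fake import FakeListLLM
from langchain_core.retrievers import BaseRetriever

from langchain_classic.chains.conversational_retrieval.base import (
    ConversationalRetrievalChain,  # assumed export; see note above
)


class ToyRetriever(BaseRetriever):
    """Returns one canned document regardless of the query."""

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> list[Document]:
        return [Document(page_content="Chains compose LLM calls with retrieval.")]


# The fake LLM answers in order: first the condensed question, then the final answer.
llm = FakeListLLM(responses=["Standalone question", "They compose LLM calls with retrieval."])

chain = ConversationalRetrievalChain.from_llm(llm=llm, retriever=ToyRetriever())

result = chain.invoke(
    {
        "question": "What do chains do?",
        # Chat history here uses the (human, ai) tuple form of CHAT_TURN_TYPE.
        "chat_history": [("Hi", "Hello! How can I help?")],
    }
)
print(result["answer"])  # -> "They compose LLM calls with retrieval."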