BaseConversationalRetrievalChain Class — langchain Architecture

Architecture documentation for the BaseConversationalRetrievalChain class in base.py from the langchain codebase.

Entity Profile

BaseConversationalRetrievalChain is an abstract base class for chains that chat with an index: a question_generator chain condenses the incoming question and chat history into a standalone question, which drives document retrieval, and a combine_docs_chain produces the final answer from the retrieved documents. The class extends Chain and is defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py.

Dependency Diagram

graph TD
  6b236612_c8e8_0e28_1c54_1e8542b421ec["BaseConversationalRetrievalChain"]
  097a4781_5519_0b5d_6244_98c64eadc0d6["Chain"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|extends| 097a4781_5519_0b5d_6244_98c64eadc0d6
  0da0ed82_99c0_5721_d306_207eb668ee77["base.py"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|defined in| 0da0ed82_99c0_5721_d306_207eb668ee77
  317f6c15_d4c6_a239_67e6_9053495a707e["input_keys()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| 317f6c15_d4c6_a239_67e6_9053495a707e
  0c23afe1_9b3b_8410_6af3_6dc61aafd24c["get_input_schema()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| 0c23afe1_9b3b_8410_6af3_6dc61aafd24c
  d7fb3a43_28d3_2c6f_3d4b_c699f62e8993["output_keys()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| d7fb3a43_28d3_2c6f_3d4b_c699f62e8993
  7852048e_347c_94dc_2d26_38ab2f98fe4d["_get_docs()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| 7852048e_347c_94dc_2d26_38ab2f98fe4d
  768e1bd1_e4e5_8673_5245_34bbfa69cdda["_call()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| 768e1bd1_e4e5_8673_5245_34bbfa69cdda
  d49e0f1b_9c2a_78f2_3372_ca2b080b6b83["_aget_docs()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| d49e0f1b_9c2a_78f2_3372_ca2b080b6b83
  35b2b62c_3039_4cca_0edd_ed3db2cdce55["_acall()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| 35b2b62c_3039_4cca_0edd_ed3db2cdce55
  42edd68c_f09b_173e_5bda_3282e8d5886b["save()"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec -->|method| 42edd68c_f09b_173e_5bda_3282e8d5886b


Source Code

libs/langchain/langchain_classic/chains/conversational_retrieval/base.py, lines 77–252 (the listing below is an excerpt)

class BaseConversationalRetrievalChain(Chain):
    """Chain for chatting with an index."""

    combine_docs_chain: BaseCombineDocumentsChain
    """The chain used to combine any retrieved documents."""
    question_generator: LLMChain
    """The chain used to generate a new question for the sake of retrieval.
    This chain will take in the current question (with variable `question`)
    and any chat history (with variable `chat_history`) and will produce
    a new standalone question to be used later on."""
    output_key: str = "answer"
    """The output key to return the final answer of this chain in."""
    rephrase_question: bool = True
    """Whether or not to pass the new generated question to the combine_docs_chain.
    If `True`, will pass the new generated question along.
    If `False`, will only use the new generated question for retrieval and pass the
    original question along to the combine_docs_chain."""
    return_source_documents: bool = False
    """Return the retrieved source documents as part of the final result."""
    return_generated_question: bool = False
    """Return the generated question as part of the final result."""
    get_chat_history: Callable[[list[CHAT_TURN_TYPE]], str] | None = None
    """An optional function to get a string of the chat history.
    If `None` is provided, will use a default."""
    response_if_no_docs_found: str | None = None
    """If specified, the chain will return a fixed response if no docs
    are found for the question. """

    model_config = ConfigDict(
        populate_by_name=True,
        arbitrary_types_allowed=True,
        extra="forbid",
    )

    @property
    def input_keys(self) -> list[str]:
        """Input keys."""
        return ["question", "chat_history"]

    @override
    def get_input_schema(
        self,
        config: RunnableConfig | None = None,
    ) -> type[BaseModel]:
        return InputType

    @property
    def output_keys(self) -> list[str]:
        """Return the output keys."""
        _output_keys = [self.output_key]
        if self.return_source_documents:
            _output_keys = [*_output_keys, "source_documents"]
        if self.return_generated_question:
            _output_keys = [*_output_keys, "generated_question"]
        return _output_keys

    @abstractmethod
    def _get_docs(
        self,
        question: str,
        inputs: dict[str, Any],
        *,
        run_manager: CallbackManagerForChainRun,
    ) -> list[Document]:
        """Get docs."""

    def _call(
        self,
        inputs: dict[str, Any],
        run_manager: CallbackManagerForChainRun | None = None,
    ) -> dict[str, Any]:
        _run_manager = run_manager or CallbackManagerForChainRun.get_noop_manager()
        question = inputs["question"]
        get_chat_history = self.get_chat_history or _get_chat_history
        chat_history_str = get_chat_history(inputs["chat_history"])

        if chat_history_str:
            callbacks = _run_manager.get_child()
            new_question = self.question_generator.run(
                question=question,
                chat_history=chat_history_str,
                callbacks=callbacks,
            )
        else:
            new_question = question
        # ... remainder of _call (document retrieval via _get_docs, answer
        # generation via combine_docs_chain, and output assembly) omitted.

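For orientation, here is a hedged usage sketch of ConversationalRetrievalChain, the concrete subclass defined in the same module and the usual entry point for this base class. The llm and vectorstore objects are placeholders, and the exact import path may differ across langchain versions.

from langchain.chains import ConversationalRetrievalChain  # import path may vary by version

# llm and vectorstore are assumed to exist already (any chat model plus a
# vector store such as FAISS or Chroma).
chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=vectorstore.as_retriever(),
    return_source_documents=True,    # adds "source_documents" to output_keys
    return_generated_question=True,  # adds "generated_question" to output_keys
    rephrase_question=False,         # retrieve with the rewritten question, answer the original
)

result = chain.invoke(
    {
        "question": "What does the retriever index contain?",
        "chat_history": [("Hi", "Hello! How can I help?")],  # list of (human, ai) turns
    }
)
print(result["answer"])            # output_key defaults to "answer"
print(result["source_documents"])  # present because return_source_documents=True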
Extends

BaseConversationalRetrievalChain extends Chain, langchain's abstract base class for chains.
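
Because _get_docs and _aget_docs are abstract, a concrete subclass must supply the retrieval step itself. The following is a minimal, hypothetical sketch of that contract (SimpleRetrievalChain and its retriever field are illustrative names, not part of the codebase; in practice ConversationalRetrievalChain fills this role by delegating to a BaseRetriever).

from typing import Any

from langchain_core.callbacks import (
    AsyncCallbackManagerForChainRun,
    CallbackManagerForChainRun,
)
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever

# Import path for the base class may vary by langchain version.
from langchain.chains.conversational_retrieval.base import (
    BaseConversationalRetrievalChain,
)


class SimpleRetrievalChain(BaseConversationalRetrievalChain):
    """Hypothetical subclass: fetch documents from a retriever, no post-processing."""

    retriever: BaseRetriever

    def _get_docs(
        self,
        question: str,
        inputs: dict[str, Any],
        *,
        run_manager: CallbackManagerForChainRun,
    ) -> list[Document]:
        # The (possibly rephrased) standalone question drives retrieval.
        return self.retriever.invoke(
            question, config={"callbacks": run_manager.get_child()}
        )

    async def _aget_docs(
        self,
        question: str,
        inputs: dict[str, Any],
        *,
        run_manager: AsyncCallbackManagerForChainRun,
    ) -> list[Document]:
        return await self.retriever.ainvoke(
            question, config={"callbacks": run_manager.get_child()}
        )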

Frequently Asked Questions

What is the BaseConversationalRetrievalChain class?
BaseConversationalRetrievalChain is the abstract base class for conversational retrieval ("chat with an index") chains in the langchain codebase, defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py.
Where is BaseConversationalRetrievalChain defined?
BaseConversationalRetrievalChain is defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py at line 77.
What does BaseConversationalRetrievalChain extend?
BaseConversationalRetrievalChain extends Chain.
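
The configuration fields documented in the source listing can also be supplied at construction time. Below is a hedged sketch of a custom get_chat_history formatter together with response_if_no_docs_found; llm and vectorstore are the same placeholders as in the earlier sketch.

from langchain.chains import ConversationalRetrievalChain  # import path may vary by version


def format_history(chat_history) -> str:
    """Render chat turns into the plain string the question generator expects."""
    lines = []
    for turn in chat_history:
        if isinstance(turn, tuple):  # (human, ai) tuple turns
            human, ai = turn
            lines.append(f"Human: {human}")
            lines.append(f"Assistant: {ai}")
        else:  # message-style turns (CHAT_TURN_TYPE also allows messages)
            lines.append(f"{turn.type}: {turn.content}")
    return "\n".join(lines)


chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=vectorstore.as_retriever(),
    get_chat_history=format_history,
    response_if_no_docs_found="No relevant documents were found in the index.",
)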
