ConversationalRetrievalChain Class — langchain Architecture

Architecture documentation for the ConversationalRetrievalChain class in base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab["ConversationalRetrievalChain"]
  6b236612_c8e8_0e28_1c54_1e8542b421ec["BaseConversationalRetrievalChain"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|extends| 6b236612_c8e8_0e28_1c54_1e8542b421ec
  de0f008b_5a13_6232_14aa_0d6a4879e132["StuffDocumentsChain"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|uses| de0f008b_5a13_6232_14aa_0d6a4879e132
  0da0ed82_99c0_5721_d306_207eb668ee77["base.py"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|defined in| 0da0ed82_99c0_5721_d306_207eb668ee77
  daa47189_48d3_4e25_0f77_01b6427cea32["_reduce_tokens_below_limit()"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|method| daa47189_48d3_4e25_0f77_01b6427cea32
  1da0fe54_4a79_39af_231a_19ccdec4b565["_get_docs()"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|method| 1da0fe54_4a79_39af_231a_19ccdec4b565
  e2f1575f_4898_a1d2_2a16_e8ee5c8e075e["_aget_docs()"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|method| e2f1575f_4898_a1d2_2a16_e8ee5c8e075e
  ad93493e_6d73_3e31_a385_b6796e9cf3d1["from_llm()"]
  5a2d0e30_7f18_1297_10e7_2a681b9bf9ab -->|method| ad93493e_6d73_3e31_a385_b6796e9cf3d1

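The `_reduce_tokens_below_limit()` method listed above exists to keep the retrieved documents within a token budget before they are stuffed into the prompt. A minimal self-contained sketch of that idea follows; the whitespace token counter is an illustrative stand-in (the real method delegates token counting to the combine-documents chain's LLM), not the actual implementation:

```python
# Illustrative sketch of a _reduce_tokens_below_limit-style helper.
# Stand-in token counter: the real chain asks its LLM for token counts.

def count_tokens(text: str) -> int:
    """Naive stand-in token counter (word count)."""
    return len(text.split())

def reduce_tokens_below_limit(docs: list[str], max_tokens_limit: int) -> list[str]:
    """Drop trailing documents until the total token count fits the budget."""
    num_docs = len(docs)
    token_counts = [count_tokens(d) for d in docs]
    total = sum(token_counts)
    # Documents are dropped from the end, so the highest-ranked
    # (earliest) retrieval results are kept.
    while total > max_tokens_limit and num_docs > 0:
        num_docs -= 1
        total -= token_counts[num_docs]
    return docs[:num_docs]

docs = ["alpha beta gamma", "one two", "a b c d"]
print(reduce_tokens_below_limit(docs, max_tokens_limit=5))
# → ['alpha beta gamma', 'one two']
```

Dropping from the tail rather than truncating each document keeps whole documents intact and favors the retriever's top-ranked results.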
Source Code

libs/langchain/langchain_classic/chains/conversational_retrieval/base.py lines 263–495

class ConversationalRetrievalChain(BaseConversationalRetrievalChain):
    r"""Chain for having a conversation based on retrieved documents.

    This class is deprecated. See below for an example implementation using
    `create_retrieval_chain`. Additional walkthroughs can be found at
    https://python.langchain.com/docs/use_cases/question_answering/chat_history

    ```python
    from langchain_classic.chains import (
        create_history_aware_retriever,
        create_retrieval_chain,
    )
    from langchain_classic.chains.combine_documents import (
        create_stuff_documents_chain,
    )
    from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
    from langchain_openai import ChatOpenAI

    retriever = ...  # Your retriever

    model = ChatOpenAI()

    # Contextualize question
    contextualize_q_system_prompt = (
        "Given a chat history and the latest user question "
        "which might reference context in the chat history, "
        "formulate a standalone question which can be understood "
        "without the chat history. Do NOT answer the question, just "
        "reformulate it if needed and otherwise return it as is."
    )
    contextualize_q_prompt = ChatPromptTemplate.from_messages(
        [
            ("system", contextualize_q_system_prompt),
            MessagesPlaceholder("chat_history"),
            ("human", "{input}"),
        ]
    )
    history_aware_retriever = create_history_aware_retriever(
        model, retriever, contextualize_q_prompt
    )

    # Answer question
    qa_system_prompt = (
        "You are an assistant for question-answering tasks. Use "
        "the following pieces of retrieved context to answer the "
        "question. If you don't know the answer, just say that you "
        "don't know. Use three sentences maximum and keep the answer "
        "concise."
        "\n\n"
        "{context}"
    )
    qa_prompt = ChatPromptTemplate.from_messages(
        [
            ("system", qa_system_prompt),
            MessagesPlaceholder("chat_history"),
            ("human", "{input}"),
        ]
    )
    # Below we use create_stuff_documents_chain to feed all retrieved context
    # into the LLM. Note that we can also use StuffDocumentsChain and other
    # instances of BaseCombineDocumentsChain.
    question_answer_chain = create_stuff_documents_chain(model, qa_prompt)
    rag_chain = create_retrieval_chain(history_aware_retriever, question_answer_chain)

    # Usage:
    chat_history = []  # Collect chat history here (a sequence of messages)
    rag_chain.invoke({"input": query, "chat_history": chat_history})
    ```

    This chain takes in chat history (a list of messages) and a new question,
    and then returns an answer to that question.
    The algorithm for this chain consists of three parts:

    1. Use the chat history and the new question to create a "standalone question".
        This is done so that this question can be passed into the retrieval step to
        fetch relevant documents. If only the new question was passed in, then relevant
        context may be lacking. If the whole conversation was passed into retrieval,
        there may be unnecessary information there that would distract from retrieval.

    2. This new question is passed to the retriever and relevant documents are
        returned.

    3. The retrieved documents are passed to an LLM along with either the new
        question (default) or the original question and chat history to
        generate a final response.
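The condense → retrieve → answer flow described above can be sketched as three plain functions. Everything here (the stub condenser, the keyword-overlap retriever, the echo-style answerer) is an illustrative stand-in for the LLM-backed components, not the chain's actual internals:

```python
# Hedged sketch of the algorithm's three parts with trivial stand-ins.

def condense_question(chat_history: list[tuple[str, str]], question: str) -> str:
    """Part 1: fold chat history into a standalone question.
    Stub: append the history so the retriever sees the needed context."""
    if not chat_history:
        return question
    context = " ".join(f"{role}: {text}" for role, text in chat_history)
    return f"{question} (context: {context})"

def retrieve(standalone_question: str, corpus: list[str]) -> list[str]:
    """Part 2: fetch relevant documents. Stub: naive keyword overlap."""
    words = set(standalone_question.lower().split())
    return [doc for doc in corpus if words & set(doc.lower().split())]

def answer(question: str, docs: list[str]) -> str:
    """Part 3: generate a response from the question and retrieved context.
    Stub: echo the first matching document."""
    return docs[0] if docs else "I don't know."

corpus = ["paris is the capital of france", "rust has a borrow checker"]
history = [("human", "tell me about france")]
docs = retrieve(condense_question(history, "what is its capital?"), corpus)
print(answer("what is its capital?", docs))
```

The standalone-question step is what lets a follow-up like "what is its capital?" retrieve documents about France even though the question alone never mentions it.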

Frequently Asked Questions

What is the ConversationalRetrievalChain class?
ConversationalRetrievalChain is a class in the langchain codebase, defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py.
Where is ConversationalRetrievalChain defined?
ConversationalRetrievalChain is defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py at line 263.
What does ConversationalRetrievalChain extend?
ConversationalRetrievalChain extends BaseConversationalRetrievalChain. (It uses a StuffDocumentsChain, or another BaseCombineDocumentsChain, as its combine-documents chain, but does not inherit from it.)
