
from_llm() — langchain Function Reference

Architecture documentation for the from_llm() classmethod of ConversationalRetrievalChain, defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py in the langchain codebase.

Dependency Diagram

graph TD
  this_from_llm["from_llm()"]
  ConversationalRetrievalChain["ConversationalRetrievalChain"]
  this_from_llm -->|defined in| ConversationalRetrievalChain
  other_from_llm["from_llm()"]
  other_from_llm -->|calls| this_from_llm
  this_from_llm -->|calls| other_from_llm
  style this_from_llm fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/langchain/langchain_classic/chains/conversational_retrieval/base.py lines 438–495

    def from_llm(
        cls,
        llm: BaseLanguageModel,
        retriever: BaseRetriever,
        condense_question_prompt: BasePromptTemplate = CONDENSE_QUESTION_PROMPT,
        chain_type: str = "stuff",
        verbose: bool = False,  # noqa: FBT001,FBT002
        condense_question_llm: BaseLanguageModel | None = None,
        combine_docs_chain_kwargs: dict | None = None,
        callbacks: Callbacks = None,
        **kwargs: Any,
    ) -> BaseConversationalRetrievalChain:
        """Convenience method to load chain from LLM and retriever.

        This provides some logic to create the `question_generator` chain
        as well as the combine_docs_chain.

        Args:
            llm: The default language model to use at every part of this chain
                (eg in both the question generation and the answering)
            retriever: The retriever to use to fetch relevant documents from.
            condense_question_prompt: The prompt to use to condense the chat history
                and new question into a standalone question.
            chain_type: The chain type to use to create the combine_docs_chain, will
                be sent to `load_qa_chain`.
            verbose: Verbosity flag for logging to stdout.
            condense_question_llm: The language model to use for condensing the chat
                history and new question into a standalone question. If none is
                provided, will default to `llm`.
            combine_docs_chain_kwargs: Parameters to pass as kwargs to `load_qa_chain`
                when constructing the combine_docs_chain.
            callbacks: Callbacks to pass to all subchains.
            kwargs: Additional parameters to pass when initializing
                ConversationalRetrievalChain
        """
        combine_docs_chain_kwargs = combine_docs_chain_kwargs or {}
        doc_chain = load_qa_chain(
            llm,
            chain_type=chain_type,
            verbose=verbose,
            callbacks=callbacks,
            **combine_docs_chain_kwargs,
        )

        _llm = condense_question_llm or llm
        condense_question_chain = LLMChain(
            llm=_llm,
            prompt=condense_question_prompt,
            verbose=verbose,
            callbacks=callbacks,
        )
        return cls(
            retriever=retriever,
            combine_docs_chain=doc_chain,
            question_generator=condense_question_chain,
            callbacks=callbacks,
            **kwargs,
        )
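
For orientation, here is a minimal usage sketch. It is not taken from the langchain documentation: the import path is assumed from the file location above, ChatOpenAI stands in for any chat model, and `vectorstore` is a hypothetical, already-populated vector store.

    from langchain_classic.chains import ConversationalRetrievalChain  # import path assumed from the file location above
    from langchain_openai import ChatOpenAI  # placeholder chat model; any BaseLanguageModel works

    # `vectorstore` is a hypothetical, already-populated vector store (e.g. FAISS or Chroma).
    retriever = vectorstore.as_retriever()

    chain = ConversationalRetrievalChain.from_llm(
        llm=ChatOpenAI(temperature=0),
        retriever=retriever,
        chain_type="stuff",  # forwarded to load_qa_chain
        verbose=True,
    )

    # The chain expects "question" and "chat_history" inputs and returns an "answer".
    result = chain.invoke({"question": "What does from_llm() build?", "chat_history": []})
    print(result["answer"])

Because from_llm() is a classmethod, it is invoked on the chain class itself and returns a fully wired BaseConversationalRetrievalChain.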

Calls

from_llm()

Called By

from_llm()

Frequently Asked Questions

What does from_llm() do?
from_llm() is a convenience classmethod of ConversationalRetrievalChain, defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py, that builds a BaseConversationalRetrievalChain from a language model and a retriever. It creates the combine_docs_chain via load_qa_chain and a question_generator chain that condenses the chat history and a new question into a standalone question, then instantiates the chain with the given retriever (a hedged sketch of customizing these pieces follows this FAQ).
Where is from_llm() defined?
from_llm() is defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py at line 438.
What does from_llm() call?
from_llm() calls one function: from_llm() (see the dependency diagram above).
What calls from_llm()?
from_llm() is called by one function: from_llm() (see the dependency diagram above).
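
As the docstring notes, `condense_question_llm` lets a smaller model handle the question-condensing step while a stronger model answers, and `combine_docs_chain_kwargs` is forwarded to `load_qa_chain`. A hedged sketch of that customization, reusing the hypothetical imports and `retriever` from the example above (model names are placeholders):

    from langchain_core.prompts import PromptTemplate

    # Custom QA prompt for the default "stuff" combine-docs chain; it must accept
    # the {context} and {question} variables that chain expects.
    qa_prompt = PromptTemplate.from_template(
        "Answer using only the context below.\n\n{context}\n\nQuestion: {question}"
    )

    chain = ConversationalRetrievalChain.from_llm(
        llm=ChatOpenAI(model="gpt-4o"),                         # placeholder answering model
        condense_question_llm=ChatOpenAI(model="gpt-4o-mini"),  # placeholder cheaper model for condensing
        retriever=retriever,
        combine_docs_chain_kwargs={"prompt": qa_prompt},        # forwarded to load_qa_chain
    )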
