from_llm() — langchain Function Reference

Architecture documentation for the from_llm() function in base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  a0d1fe2a_c733_6034_7b3d_9e4f623f96c1["from_llm()"]
  bc11cd5d_a081_308b_a325_0c46b7e4deb4["ChatVectorDBChain"]
  a0d1fe2a_c733_6034_7b3d_9e4f623f96c1 -->|defined in| bc11cd5d_a081_308b_a325_0c46b7e4deb4
  6f80bd92_9495_c2eb_f916_2d5aa6519679["from_llm()"]
  6f80bd92_9495_c2eb_f916_2d5aa6519679 -->|calls| a0d1fe2a_c733_6034_7b3d_9e4f623f96c1
  a0d1fe2a_c733_6034_7b3d_9e4f623f96c1 -->|calls| 6f80bd92_9495_c2eb_f916_2d5aa6519679
  style a0d1fe2a_c733_6034_7b3d_9e4f623f96c1 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/langchain/langchain_classic/chains/conversational_retrieval/base.py lines 549–578

    @classmethod
    def from_llm(
        cls,
        llm: BaseLanguageModel,
        vectorstore: VectorStore,
        condense_question_prompt: BasePromptTemplate = CONDENSE_QUESTION_PROMPT,
        chain_type: str = "stuff",
        combine_docs_chain_kwargs: dict | None = None,
        callbacks: Callbacks = None,
        **kwargs: Any,
    ) -> BaseConversationalRetrievalChain:
        """Load chain from LLM."""
        combine_docs_chain_kwargs = combine_docs_chain_kwargs or {}
        doc_chain = load_qa_chain(
            llm,
            chain_type=chain_type,
            callbacks=callbacks,
            **combine_docs_chain_kwargs,
        )
        condense_question_chain = LLMChain(
            llm=llm,
            prompt=condense_question_prompt,
            callbacks=callbacks,
        )
        return cls(
            vectorstore=vectorstore,
            combine_docs_chain=doc_chain,
            question_generator=condense_question_chain,
            callbacks=callbacks,
            **kwargs,
        )
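The body above follows a plain classmethod-factory pattern: build each sub-chain first, then delegate to the class constructor. A minimal self-contained sketch of that pattern (all class and parameter names here are illustrative stand-ins, not the real langchain API):

```python
from dataclasses import dataclass


@dataclass
class DocChain:
    """Stand-in for the combine-documents chain built by load_qa_chain()."""
    chain_type: str


@dataclass
class QuestionChain:
    """Stand-in for the LLMChain that condenses the follow-up question."""
    prompt: str


class RetrievalChain:
    """Illustrative chain mirroring the shape of the from_llm() factory."""

    def __init__(self, combine_docs_chain, question_generator):
        self.combine_docs_chain = combine_docs_chain
        self.question_generator = question_generator

    @classmethod
    def from_parts(cls, chain_type: str = "stuff", prompt: str = "condense"):
        # Construct the sub-chains, then hand them to the class constructor,
        # just as from_llm() does with load_qa_chain and LLMChain.
        doc_chain = DocChain(chain_type=chain_type)
        question_chain = QuestionChain(prompt=prompt)
        return cls(combine_docs_chain=doc_chain, question_generator=question_chain)


chain = RetrievalChain.from_parts()
print(chain.combine_docs_chain.chain_type)  # "stuff"
```

Because the factory returns `cls(...)` rather than a hard-coded class, subclasses inherit it and get instances of themselves, which is why the real method can serve any `BaseConversationalRetrievalChain` subclass.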

Frequently Asked Questions

What does from_llm() do?
from_llm() is a classmethod that constructs a BaseConversationalRetrievalChain from a language model and a vector store: it builds a document-combining chain via load_qa_chain(), a question-condensing LLMChain, and passes both to the class constructor. It is defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py.
Where is from_llm() defined?
from_llm() is defined in libs/langchain/langchain_classic/chains/conversational_retrieval/base.py at line 549.
What does from_llm() call?
from_llm() calls one function: another from_llm(), which appears as a distinct node in the dependency diagram above.
What calls from_llm()?
from_llm() is called by one function: that same separately defined from_llm().