from_llm() — langchain Function Reference
Architecture documentation for the from_llm() function in eval_chain.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    80297720_85ce_00c4_c038_befbdc6c8ea0["from_llm()"]
    8092ea8a_fc95_f2f7_c8e2_4595bb9d5b9e["CotQAEvalChain"]
    80297720_85ce_00c4_c038_befbdc6c8ea0 -->|defined in| 8092ea8a_fc95_f2f7_c8e2_4595bb9d5b9e
    5e40a8d3_ac36_a29a_8969_75ee97f0c328["_validate_input_vars()"]
    80297720_85ce_00c4_c038_befbdc6c8ea0 -->|calls| 5e40a8d3_ac36_a29a_8969_75ee97f0c328
    a3155b02_dbf2_ea43_042a_2d3c1404bdeb["from_llm()"]
    80297720_85ce_00c4_c038_befbdc6c8ea0 -->|calls| a3155b02_dbf2_ea43_042a_2d3c1404bdeb
    style 80297720_85ce_00c4_c038_befbdc6c8ea0 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/langchain_classic/evaluation/qa/eval_chain.py lines 364–373
def from_llm(
    cls,
    llm: BaseLanguageModel,
    prompt: PromptTemplate | None = None,
    **kwargs: Any,
) -> CotQAEvalChain:
    """Load QA Eval Chain from LLM."""
    prompt = prompt or COT_PROMPT
    cls._validate_input_vars(prompt)
    return cls(llm=llm, prompt=prompt, **kwargs)
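For orientation, a minimal usage sketch. The import path langchain.evaluation.qa and the ChatOpenAI model are assumptions; substitute whichever module your installed langchain version exposes CotQAEvalChain from and any BaseLanguageModel you prefer.

from langchain.evaluation.qa import CotQAEvalChain  # assumed import path
from langchain_openai import ChatOpenAI             # assumed model; any BaseLanguageModel works

llm = ChatOpenAI(temperature=0)

# No prompt is passed, so from_llm() falls back to the default COT_PROMPT,
# validates its input variables, and returns the constructed chain.
cot_eval_chain = CotQAEvalChain.from_llm(llm)

Any extra keyword arguments passed to from_llm() are forwarded to the chain constructor via **kwargs.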
Frequently Asked Questions
What does from_llm() do?
from_llm() is a classmethod on CotQAEvalChain in the langchain codebase, defined in libs/langchain/langchain_classic/evaluation/qa/eval_chain.py. It constructs a CotQAEvalChain from a BaseLanguageModel: if no PromptTemplate is supplied it falls back to the default chain-of-thought prompt (COT_PROMPT), validates the prompt's input variables via _validate_input_vars(), and returns the chain initialized with the LLM, the prompt, and any extra keyword arguments.
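A custom PromptTemplate can be passed instead of the default; from_llm() validates it before building the chain. The sketch below assumes the validator expects the same input variables as COT_PROMPT (taken here to be query, context, and result); check the evaluation prompts module for the authoritative set.

from langchain.evaluation.qa import CotQAEvalChain  # assumed import path
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI             # assumed model

# Assumption: the chain expects the variables query, context, and result.
custom_prompt = PromptTemplate(
    input_variables=["query", "context", "result"],
    template=(
        "Question: {query}\n"
        "Reference: {context}\n"
        "Student answer: {result}\n"
        "Reason step by step, then grade the answer CORRECT or INCORRECT."
    ),
)

# from_llm() runs cls._validate_input_vars(custom_prompt) before the chain
# is constructed; the custom prompt then replaces the default COT_PROMPT.
llm = ChatOpenAI(temperature=0)
cot_eval_chain = CotQAEvalChain.from_llm(llm, prompt=custom_prompt)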
Where is from_llm() defined?
from_llm() is defined in libs/langchain/langchain_classic/evaluation/qa/eval_chain.py at line 364.
What does from_llm() call?
from_llm() calls two functions: _validate_input_vars() and from_llm().
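Because of the _validate_input_vars() call, a prompt whose input variables do not match what the chain expects is rejected before the chain is constructed. A sketch of that failure path follows; the ValueError exception type is an assumption, since the validator's body is not shown in the excerpt above.

from langchain.evaluation.qa import CotQAEvalChain  # assumed import path
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI             # assumed model

# A prompt with the wrong variable set, used here only to trigger validation.
bad_prompt = PromptTemplate(
    input_variables=["question_only"],
    template="Grade this answer: {question_only}",
)

try:
    CotQAEvalChain.from_llm(ChatOpenAI(temperature=0), prompt=bad_prompt)
except ValueError as err:  # assumed exception type raised by _validate_input_vars()
    print(f"Prompt rejected: {err}")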