llm_prefix() — langchain Function Reference
Architecture documentation for the llm_prefix() function in base.py from the langchain codebase.
Dependency Diagram
graph TD
    99a35c1c_29a3_6f7e_7378_3927cd5afa50["llm_prefix()"]
    fe108031_e1a6_516d_06e0_e8bd82ddfd8b["ConversationalAgent"]
    99a35c1c_29a3_6f7e_7378_3927cd5afa50 -->|defined in| fe108031_e1a6_516d_06e0_e8bd82ddfd8b
    style 99a35c1c_29a3_6f7e_7378_3927cd5afa50 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/langchain_classic/agents/conversational/base.py lines 66–72
def llm_prefix(self) -> str:
    """Prefix to append the llm call with.

    Returns:
        "Thought: "
    """
    return "Thought:"
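The agent ends its scratchpad with this prefix so the model is cued to continue its reasoning on the next turn. The following self-contained sketch (not the library's actual implementation) illustrates the typical pattern; the names construct_scratchpad, intermediate_steps, and OBSERVATION_PREFIX are illustrative assumptions rather than langchain APIs.

# Self-contained sketch (not langchain's actual code) of how a prefix such as
# "Thought:" is typically appended when an agent rebuilds the text sent back to
# the LLM between tool calls. All names here are illustrative assumptions.

LLM_PREFIX = "Thought:"                # value returned by llm_prefix()
OBSERVATION_PREFIX = "Observation: "   # assumed counterpart prefix for tool output


def construct_scratchpad(intermediate_steps: list[tuple[str, str]]) -> str:
    """Join prior (action_log, observation) pairs and end with the LLM prefix.

    Ending the scratchpad with "Thought:" cues the model to produce its next
    reasoning step rather than repeating earlier output.
    """
    thoughts = ""
    for action_log, observation in intermediate_steps:
        thoughts += action_log
        thoughts += f"\n{OBSERVATION_PREFIX}{observation}\n{LLM_PREFIX}"
    return thoughts


if __name__ == "__main__":
    steps = [
        (
            "I should look up the current weather.\n"
            "Action: search\n"
            "Action Input: weather in Paris",
            "Sunny, 22°C",
        )
    ]
    print(construct_scratchpad(steps))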
Frequently Asked Questions
What does llm_prefix() do?
llm_prefix() is a method of the ConversationalAgent class, defined in libs/langchain/langchain_classic/agents/conversational/base.py. It returns the string "Thought:", the prefix appended to the LLM call so the model continues with its next reasoning step.
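As a hedged illustration only, the sketch below overrides this prefix in a subclass. The import path is inferred from the file location cited above and may differ between langchain releases, and the @property decorator is assumed from the surrounding class's conventions.

# Hypothetical sketch of overriding llm_prefix() in a subclass. The import path
# is assumed from the file location shown above and may vary between releases.
from langchain_classic.agents.conversational.base import ConversationalAgent


class ReasoningPrefixAgent(ConversationalAgent):
    """Illustrative subclass that swaps the default "Thought:" marker."""

    @property  # assumed: the base class exposes llm_prefix as a property
    def llm_prefix(self) -> str:
        # This string is appended to the prompt before the LLM's next turn.
        return "Reasoning:"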
Where is llm_prefix() defined?
llm_prefix() is defined in libs/langchain/langchain_classic/agents/conversational/base.py at line 66.