_load_chat_prompt() — langchain Function Reference
Architecture documentation for the _load_chat_prompt() function in loading.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    dbcccaa3_06ce_9980_1c9a_3460aa8ea3c6["_load_chat_prompt()"]
    fc5c0c58_eb79_058b_cc17_205426e210b8["loading.py"]
    dbcccaa3_06ce_9980_1c9a_3460aa8ea3c6 -->|defined in| fc5c0c58_eb79_058b_cc17_205426e210b8
    style dbcccaa3_06ce_9980_1c9a_3460aa8ea3c6 fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
libs/core/langchain_core/prompts/loading.py lines 180–190
def _load_chat_prompt(config: dict) -> ChatPromptTemplate:
    """Load chat prompt from config."""
    messages = config.pop("messages")
    template = messages[0]["prompt"].pop("template") if messages else None
    config.pop("input_variables")

    if not template:
        msg = "Can't load chat prompt without template"
        raise ValueError(msg)

    return ChatPromptTemplate.from_template(template=template, **config)
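The sketch below shows how a config dictionary of the shape read by the function above becomes a ChatPromptTemplate. It is a minimal illustration only: the template text and variable names are made up, and it calls the private helper directly, whereas in normal use the function is reached through the public prompt-loading machinery in the same module rather than imported by hand.

# Hedged sketch: calling the private helper directly with a hand-built config.
# The dictionary shape mirrors what the source above reads:
#   messages[0]["prompt"]["template"] plus a top-level "input_variables" key.
from langchain_core.prompts.loading import _load_chat_prompt

config = {
    "messages": [
        {"prompt": {"template": "Tell me a {adjective} joke about {topic}."}}
    ],
    "input_variables": ["adjective", "topic"],
}

prompt = _load_chat_prompt(config)  # returns a ChatPromptTemplate
print(prompt.format_messages(adjective="clean", topic="penguins"))

After the function pops "messages" and "input_variables", any remaining keys in the config are forwarded as keyword arguments to ChatPromptTemplate.from_template().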
Frequently Asked Questions
What does _load_chat_prompt() do?
_load_chat_prompt() builds a ChatPromptTemplate from a prompt-config dictionary. It pops the "messages" and "input_variables" entries from the config, takes the template string from the first message's prompt, and passes it, along with any remaining config keys, to ChatPromptTemplate.from_template(). If no template can be extracted, it raises a ValueError (see the sketch after this FAQ). It is defined in libs/core/langchain_core/prompts/loading.py.
Where is _load_chat_prompt() defined?
_load_chat_prompt() is defined in libs/core/langchain_core/prompts/loading.py at line 180.
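As a quick illustration of the error path described above (a sketch only; the empty config is invented for the example), an empty "messages" list leaves no template to extract, so the function raises a ValueError:

# Hedged sketch of the error path: no messages means no template, so the
# function raises ValueError("Can't load chat prompt without template").
from langchain_core.prompts.loading import _load_chat_prompt

try:
    _load_chat_prompt({"messages": [], "input_variables": []})
except ValueError as err:
    print(err)  # -> Can't load chat prompt without template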