refine_prompts.py — langchain Source File
Architecture documentation for refine_prompts.py, a Python file in the langchain codebase. 3 imports, 0 dependents.
Dependency Diagram
graph LR
    c0ea54b1_a0b8_10ac_3e7c_7d9368d3beb1["refine_prompts.py"]
    16c7d167_e2e4_cd42_2bc2_d182459cd93c["langchain_core.prompts.chat"]
    c0ea54b1_a0b8_10ac_3e7c_7d9368d3beb1 --> 16c7d167_e2e4_cd42_2bc2_d182459cd93c
    4b3dcc0f_d872_0044_39ec_2d289f87f9e6["langchain_core.prompts.prompt"]
    c0ea54b1_a0b8_10ac_3e7c_7d9368d3beb1 --> 4b3dcc0f_d872_0044_39ec_2d289f87f9e6
    19f929ef_6721_dbe8_8478_f6ed6cf3eb7d["langchain_classic.chains.prompt_selector"]
    c0ea54b1_a0b8_10ac_3e7c_7d9368d3beb1 --> 19f929ef_6721_dbe8_8478_f6ed6cf3eb7d
    style c0ea54b1_a0b8_10ac_3e7c_7d9368d3beb1 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
from langchain_core.prompts.chat import (
    ChatPromptTemplate,
)
from langchain_core.prompts.prompt import PromptTemplate

from langchain_classic.chains.prompt_selector import (
    ConditionalPromptSelector,
    is_chat_model,
)

DEFAULT_REFINE_PROMPT_TMPL = (
    "The original question is as follows: {question}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_str}\n"
    "------------\n"
    "Given the new context, refine the original answer to better "
    "answer the question. "
    "If the context isn't useful, return the original answer."
)
DEFAULT_REFINE_PROMPT = PromptTemplate.from_template(DEFAULT_REFINE_PROMPT_TMPL)

refine_template = (
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_str}\n"
    "------------\n"
    "Given the new context, refine the original answer to better "
    "answer the question. "
    "If the context isn't useful, return the original answer."
)
CHAT_REFINE_PROMPT = ChatPromptTemplate.from_messages(
    [
        ("human", "{question}"),
        ("ai", "{existing_answer}"),
        ("human", refine_template),
    ]
)
REFINE_PROMPT_SELECTOR = ConditionalPromptSelector(
    default_prompt=DEFAULT_REFINE_PROMPT,
    conditionals=[(is_chat_model, CHAT_REFINE_PROMPT)],
)

DEFAULT_TEXT_QA_PROMPT_TMPL = (
    "Context information is below. \n"
    "------------\n"
    "{context_str}\n"
    "------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {question}\n"
)
DEFAULT_TEXT_QA_PROMPT = PromptTemplate.from_template(DEFAULT_TEXT_QA_PROMPT_TMPL)

chat_qa_prompt_template = (
    "Context information is below.\n"
    "------------\n"
    "{context_str}\n"
    "------------\n"
    "Given the context information and not prior knowledge, "
    "answer any questions"
)
CHAT_QUESTION_PROMPT = ChatPromptTemplate.from_messages(
    [
        ("system", chat_qa_prompt_template),
        ("human", "{question}"),
    ]
)
QUESTION_PROMPT_SELECTOR = ConditionalPromptSelector(
    default_prompt=DEFAULT_TEXT_QA_PROMPT,
    conditionals=[(is_chat_model, CHAT_QUESTION_PROMPT)],
)
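Both selectors are consumed through ConditionalPromptSelector.get_prompt(llm), which walks the conditionals list and returns the first prompt whose predicate matches the model, falling back to default_prompt. A minimal sketch of that behavior, using langchain_core's built-in fake models as stand-ins for a real LLM (the responses values are placeholders):

from langchain_core.language_models import FakeListChatModel, FakeListLLM

chat_model = FakeListChatModel(responses=["stub"])  # a BaseChatModel subclass
completion_model = FakeListLLM(responses=["stub"])  # a plain completion LLM

# is_chat_model(chat_model) is True, so the chat variant is selected;
# the completion model falls through to the default prompt.
assert QUESTION_PROMPT_SELECTOR.get_prompt(chat_model) is CHAT_QUESTION_PROMPT
assert QUESTION_PROMPT_SELECTOR.get_prompt(completion_model) is DEFAULT_TEXT_QA_PROMPT
assert REFINE_PROMPT_SELECTOR.get_prompt(chat_model) is CHAT_REFINE_PROMPT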
Dependencies
- langchain_classic.chains.prompt_selector
- langchain_core.prompts.chat
- langchain_core.prompts.prompt
Frequently Asked Questions
What does refine_prompts.py do?
refine_prompts.py defines the prompt templates for the refine question-answering chain: completion-style templates (DEFAULT_REFINE_PROMPT, DEFAULT_TEXT_QA_PROMPT), chat-message equivalents (CHAT_REFINE_PROMPT, CHAT_QUESTION_PROMPT), and two ConditionalPromptSelector instances that return the chat variant when the target model is a chat model.
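For illustration, a short sketch of how the two styles render at runtime; the question, answer, and context strings here are made-up placeholders:

# Completion-style: renders to a single prompt string.
text = DEFAULT_REFINE_PROMPT.format(
    question="What is task decomposition?",
    existing_answer="It splits a task into steps.",
    context_str="New context retrieved from the corpus.",
)

# Chat-style: renders to a human/ai/human message list instead.
messages = CHAT_REFINE_PROMPT.format_messages(
    question="What is task decomposition?",
    existing_answer="It splits a task into steps.",
    context_str="New context retrieved from the corpus.",
)
# -> [HumanMessage, AIMessage, HumanMessage]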
What does refine_prompts.py depend on?
refine_prompts.py imports three modules: langchain_classic.chains.prompt_selector, langchain_core.prompts.chat, and langchain_core.prompts.prompt.
Where is refine_prompts.py in the architecture?
refine_prompts.py is located at libs/langchain/langchain_classic/chains/question_answering/refine_prompts.py (directory: libs/langchain/langchain_classic/chains/question_answering).