prompt.py — langchain Source File
Architecture documentation for prompt.py, a Python file in the langchain codebase. 3 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
    827c3b83_f2fd_9b59_d130_e76b845c5aa7["prompt.py"]
    16c7d167_e2e4_cd42_2bc2_d182459cd93c["langchain_core.prompts.chat"]
    827c3b83_f2fd_9b59_d130_e76b845c5aa7 --> 16c7d167_e2e4_cd42_2bc2_d182459cd93c
    4b3dcc0f_d872_0044_39ec_2d289f87f9e6["langchain_core.prompts.prompt"]
    827c3b83_f2fd_9b59_d130_e76b845c5aa7 --> 4b3dcc0f_d872_0044_39ec_2d289f87f9e6
    19f929ef_6721_dbe8_8478_f6ed6cf3eb7d["langchain_classic.chains.prompt_selector"]
    827c3b83_f2fd_9b59_d130_e76b845c5aa7 --> 19f929ef_6721_dbe8_8478_f6ed6cf3eb7d
    style 827c3b83_f2fd_9b59_d130_e76b845c5aa7 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
from langchain_core.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)
from langchain_core.prompts.prompt import PromptTemplate

from langchain_classic.chains.prompt_selector import (
    ConditionalPromptSelector,
    is_chat_model,
)

templ1 = """You are a smart assistant designed to help high school teachers come up with reading comprehension questions.
Given a piece of text, you must come up with a question and answer pair that can be used to test a student's reading comprehension abilities.
When coming up with this question/answer pair, you must respond in the following format:
```
{{
    "question": "$YOUR_QUESTION_HERE",
    "answer": "$THE_ANSWER_HERE"
}}
```
Everything between the ``` must be valid json.
"""  # noqa: E501

templ2 = """Please come up with a question/answer pair, in the specified JSON format, for the following text:
----------------
{text}"""  # noqa: E501

CHAT_PROMPT = ChatPromptTemplate.from_messages(
    [
        SystemMessagePromptTemplate.from_template(templ1),
        HumanMessagePromptTemplate.from_template(templ2),
    ]
)

templ = """You are a smart assistant designed to help high school teachers come up with reading comprehension questions.
Given a piece of text, you must come up with a question and answer pair that can be used to test a student's reading comprehension abilities.
When coming up with this question/answer pair, you must respond in the following format:
```
{{
    "question": "$YOUR_QUESTION_HERE",
    "answer": "$THE_ANSWER_HERE"
}}
```
Everything between the ``` must be valid json.
Please come up with a question/answer pair, in the specified JSON format, for the following text:
----------------
{text}"""  # noqa: E501

PROMPT = PromptTemplate.from_template(templ)

PROMPT_SELECTOR = ConditionalPromptSelector(
    default_prompt=PROMPT, conditionals=[(is_chat_model, CHAT_PROMPT)]
)
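One detail worth noting in the templates above: PromptTemplate's default "f-string" format treats `{text}` as an input variable, so the literal JSON braces in the example output must be doubled (`{{` and `}}`). Plain `str.format` follows the same escaping rule, so the behavior can be shown standalone, without langchain installed (the template string here is a shortened stand-in, not the real `templ`):

```python
# str.format mirrors PromptTemplate's default f-string template semantics:
# {{ and }} render as single literal braces, {text} is substituted.
templ_sketch = '{{\n    "question": "$YOUR_QUESTION_HERE"\n}}\nText: {text}'

rendered = templ_sketch.format(text="The sky is blue.")
print(rendered)
# The doubled braces come out as single braces; {text} is replaced:
# {
#     "question": "$YOUR_QUESTION_HERE"
# }
# Text: The sky is blue.
```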
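At runtime, `PROMPT_SELECTOR.get_prompt(llm)` walks its `(condition, prompt)` pairs and returns the first prompt whose condition matches the model, falling back to `default_prompt`. A minimal standalone sketch of that selection logic, assuming nothing beyond the stdlib (the classes and the attribute-based `is_chat` check are illustrative stand-ins, not the real langchain types, which test `isinstance(llm, BaseChatModel)`):

```python
# Illustrative sketch of ConditionalPromptSelector's selection logic.
def is_chat_model_sketch(llm) -> bool:
    # Stand-in for langchain's is_chat_model; uses a simple flag here.
    return getattr(llm, "is_chat", False)

class ConditionalPromptSelectorSketch:
    def __init__(self, default_prompt, conditionals):
        self.default_prompt = default_prompt
        self.conditionals = conditionals

    def get_prompt(self, llm):
        # Return the first prompt whose condition matches, else the default.
        for condition, prompt in self.conditionals:
            if condition(llm):
                return prompt
        return self.default_prompt

class FakeChatModel:
    is_chat = True

class FakeCompletionModel:
    is_chat = False

selector = ConditionalPromptSelectorSketch(
    default_prompt="PROMPT",
    conditionals=[(is_chat_model_sketch, "CHAT_PROMPT")],
)
print(selector.get_prompt(FakeChatModel()))        # prints CHAT_PROMPT
print(selector.get_prompt(FakeCompletionModel()))  # prints PROMPT
```

This is why the file defines two variants of the same instructions: chat models get a system/human message pair, while completion-style models get one flat string.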
Dependencies
- langchain_classic.chains.prompt_selector
- langchain_core.prompts.chat
- langchain_core.prompts.prompt
Frequently Asked Questions
What does prompt.py do?
prompt.py is a Python source file in the langchain codebase. It defines the prompt templates for the QA-generation chain: CHAT_PROMPT for chat models, PROMPT for completion-style models, and PROMPT_SELECTOR, which picks between them based on the model type.
What does prompt.py depend on?
prompt.py imports 3 module(s): langchain_classic.chains.prompt_selector, langchain_core.prompts.chat, langchain_core.prompts.prompt.
Where is prompt.py in the architecture?
prompt.py is located at libs/langchain/langchain_classic/chains/qa_generation/prompt.py (directory: libs/langchain/langchain_classic/chains/qa_generation).