QAGenerationChain Class — langchain Architecture
Architecture documentation for the QAGenerationChain class in base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    2ce04a9b_5637_00e2_15a2_00f9ad51f684["QAGenerationChain"]
    f3cef70e_11b0_61c9_7ec0_7308f4b45056["Chain"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|extends| f3cef70e_11b0_61c9_7ec0_7308f4b45056
    abcb2fe7_a066_23d4_b7f5_15af8ffd2920["base.py"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|defined in| abcb2fe7_a066_23d4_b7f5_15af8ffd2920
    610e495a_2355_f81c_2c3f_0bfb090f206d["from_llm()"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|method| 610e495a_2355_f81c_2c3f_0bfb090f206d
    b924f598_539c_def7_0504_069160fb0d62["_chain_type()"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|method| b924f598_539c_def7_0504_069160fb0d62
    045d06b7_df8e_4a58_b958_b4e00a039900["input_keys()"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|method| 045d06b7_df8e_4a58_b958_b4e00a039900
    17bb8513_b3d0_a798_7e2f_d9eaa8f8ad95["output_keys()"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|method| 17bb8513_b3d0_a798_7e2f_d9eaa8f8ad95
    62e4d560_1563_c959_ce77_10d90ce81c1f["_call()"]
    2ce04a9b_5637_00e2_15a2_00f9ad51f684 -->|method| 62e4d560_1563_c959_ce77_10d90ce81c1f
Source Code
libs/langchain/langchain_classic/chains/qa_generation/base.py lines 27–127
class QAGenerationChain(Chain):
    """Base class for question-answer generation chains.

    This class is deprecated. See below for an alternative implementation.

    Advantages of this implementation include:

    - Supports async and streaming;
    - Surfaces prompt and text splitter for easier customization;
    - Use of JsonOutputParser supports JSONPatch operations in streaming mode,
      as well as robustness to markdown.

    ```python
    from langchain_classic.chains.qa_generation.prompt import (
        CHAT_PROMPT as prompt,
    )

    # Note: import PROMPT if using a legacy non-chat model.
    from langchain_core.output_parsers import JsonOutputParser
    from langchain_core.runnables import (
        RunnableLambda,
        RunnableParallel,
        RunnablePassthrough,
    )
    from langchain_core.runnables.base import RunnableEach
    from langchain_openai import ChatOpenAI
    from langchain_text_splitters import RecursiveCharacterTextSplitter

    model = ChatOpenAI()
    text_splitter = RecursiveCharacterTextSplitter(chunk_overlap=500)
    split_text = RunnableLambda(lambda x: text_splitter.create_documents([x]))

    chain = RunnableParallel(
        text=RunnablePassthrough(),
        questions=(
            split_text | RunnableEach(bound=prompt | model | JsonOutputParser())
        ),
    )
    ```
    """

    llm_chain: LLMChain
    """LLM Chain that generates responses from user input and context."""
    text_splitter: TextSplitter = Field(
        default=RecursiveCharacterTextSplitter(chunk_overlap=500),
    )
    """Text splitter that splits the input into chunks."""
    input_key: str = "text"
    """Key of the input to the chain."""
    output_key: str = "questions"
    """Key of the output of the chain."""
    k: int | None = None
    """Number of questions to generate."""

    @classmethod
    def from_llm(
        cls,
        llm: BaseLanguageModel,
        prompt: BasePromptTemplate | None = None,
        **kwargs: Any,
    ) -> QAGenerationChain:
        """Create a QAGenerationChain from a language model.

        Args:
            llm: a language model
            prompt: a prompt template
            **kwargs: additional arguments

        Returns:
            a QAGenerationChain class
        """
        _prompt = prompt or PROMPT_SELECTOR.get_prompt(llm)
        chain = LLMChain(llm=llm, prompt=_prompt)
        return cls(llm_chain=chain, **kwargs)

    @property
    def _chain_type(self) -> str:
        raise NotImplementedError

    @property
    @override
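The class docstring credits `JsonOutputParser` with "robustness to markdown", meaning it can recover JSON even when the model wraps its output in a fenced code block. A stdlib-only sketch of that idea follows; the `parse_json_markdown` helper below is a simplified illustration written for this page, not the actual parser implementation:

```python
import json
import re


def parse_json_markdown(text: str) -> dict:
    """Parse JSON that may be wrapped in a markdown code fence.

    Simplified illustration of the markdown robustness that
    JsonOutputParser provides; not the real implementation.
    """
    # If the model emitted a ```json ... ``` fence, keep only its body.
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    payload = match.group(1) if match else text
    return json.loads(payload)


raw = '```json\n{"question": "What does this class extend?", "answer": "Chain"}\n```'
print(parse_json_markdown(raw))
# → {'question': 'What does this class extend?', 'answer': 'Chain'}
```

Either bare JSON or fenced JSON parses to the same dict, which is why the alternative LCEL pipeline in the docstring can pipe model output straight into the parser.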
Frequently Asked Questions
What is the QAGenerationChain class?
QAGenerationChain is a deprecated base class for question-answer generation chains in the langchain codebase, defined in libs/langchain/langchain_classic/chains/qa_generation/base.py.
Where is QAGenerationChain defined?
QAGenerationChain is defined in libs/langchain/langchain_classic/chains/qa_generation/base.py at line 27.
What does QAGenerationChain extend?
QAGenerationChain extends Chain.