chain_filter.py — langchain Source File
Architecture documentation for chain_filter.py, a Python source file in the langchain codebase. 13 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
    f44a9460_d594_ffbb_d8b3_f434774f862a["chain_filter.py"]
    cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7["collections.abc"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7
    8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
    f3bc7443_c889_119d_0744_aacc3620d8d2["langchain_core.callbacks"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> f3bc7443_c889_119d_0744_aacc3620d8d2
    c554676d_b731_47b2_a98f_c1c2d537c0aa["langchain_core.documents"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> c554676d_b731_47b2_a98f_c1c2d537c0aa
    ba43b74d_3099_7e1c_aac3_cf594720469e["langchain_core.language_models"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> ba43b74d_3099_7e1c_aac3_cf594720469e
    83d7c7fd_1989_762c_9cf3_cecb50ada22b["langchain_core.output_parsers"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 83d7c7fd_1989_762c_9cf3_cecb50ada22b
    e6b4f61e_7b98_6666_3641_26b069517d4a["langchain_core.prompts"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> e6b4f61e_7b98_6666_3641_26b069517d4a
    2ceb1686_0f8c_8ae0_36d1_7c0b702fda1c["langchain_core.runnables"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 2ceb1686_0f8c_8ae0_36d1_7c0b702fda1c
    2971f9da_6393_a3e3_610e_ace3d35ee978["langchain_core.runnables.config"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 2971f9da_6393_a3e3_610e_ace3d35ee978
    6e58aaea_f08e_c099_3cc7_f9567bfb1ae7["pydantic"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 6e58aaea_f08e_c099_3cc7_f9567bfb1ae7
    9b4ec80f_d8de_a6e0_4f16_67ba56685088["langchain_classic.chains"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 9b4ec80f_d8de_a6e0_4f16_67ba56685088
    224ab9fb_0538_7d3c_bef3_5f1c82d3a53a["langchain_classic.output_parsers.boolean"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 224ab9fb_0538_7d3c_bef3_5f1c82d3a53a
    6062ae18_1969_8a3f_672b_418816406b08["langchain_classic.retrievers.document_compressors.chain_filter_prompt"]
    f44a9460_d594_ffbb_d8b3_f434774f862a --> 6062ae18_1969_8a3f_672b_418816406b08
    style f44a9460_d594_ffbb_d8b3_f434774f862a fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
"""Filter that uses an LLM to drop documents that aren't relevant to the query."""
from collections.abc import Callable, Sequence
from typing import Any
from langchain_core.callbacks import Callbacks
from langchain_core.documents import BaseDocumentCompressor, Document
from langchain_core.language_models import BaseLanguageModel
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import BasePromptTemplate, PromptTemplate
from langchain_core.runnables import Runnable
from langchain_core.runnables.config import RunnableConfig
from pydantic import ConfigDict
from langchain_classic.chains import LLMChain
from langchain_classic.output_parsers.boolean import BooleanOutputParser
from langchain_classic.retrievers.document_compressors.chain_filter_prompt import (
    prompt_template,
)
def _get_default_chain_prompt() -> PromptTemplate:
    return PromptTemplate(
        template=prompt_template,
        input_variables=["question", "context"],
        output_parser=BooleanOutputParser(),
    )


def default_get_input(query: str, doc: Document) -> dict[str, Any]:
    """Return the compression chain input."""
    return {"question": query, "context": doc.page_content}


class LLMChainFilter(BaseDocumentCompressor):
    """Filter that drops documents that aren't relevant to the query."""

    llm_chain: Runnable
    """LLM wrapper to use for filtering documents.
    The chain prompt is expected to have a BooleanOutputParser."""

    get_input: Callable[[str, Document], dict] = default_get_input
    """Callable for constructing the chain input from the query and a Document."""

    model_config = ConfigDict(
        arbitrary_types_allowed=True,
    )

    def compress_documents(
        self,
        documents: Sequence[Document],
        query: str,
        callbacks: Callbacks | None = None,
    ) -> Sequence[Document]:
        """Filter down documents based on their relevance to the query."""
        filtered_docs = []
        config = RunnableConfig(callbacks=callbacks)
        outputs = zip(
            self.llm_chain.batch(
                # ... (76 more lines)
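The listing breaks off inside compress_documents, but the visible declarations already describe the contract: llm_chain is any Runnable whose output parses to a boolean, and get_input (default_get_input unless overridden) turns the query and a Document into that chain's input dict. The sketch below is hypothetical usage, not code from this file; the stand-in RunnableLambda "LLM" and the assumption that the elided loop keeps exactly the documents whose chain output is True come from the docstrings, not from the omitted source.

# Hypothetical usage sketch, not code from chain_filter.py: exercise
# LLMChainFilter with a deterministic stand-in for the LLM.
from langchain_core.documents import Document
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

from langchain_classic.output_parsers.boolean import BooleanOutputParser
from langchain_classic.retrievers.document_compressors.chain_filter import (
    LLMChainFilter,
)

# Same shape as the default prompt: answer YES/NO on whether the context helps
# answer the question. (The real default text lives in chain_filter_prompt.)
prompt = PromptTemplate.from_template(
    "Question: {question}\nContext: {context}\n"
    "Does the context help answer the question? Answer YES or NO:"
)


def pretend_llm(prompt_value) -> str:
    # Stand-in for a real model: deem the context relevant only if it mentions
    # compressors. In practice a chat or completion model goes here.
    return "YES" if "compressor" in prompt_value.to_string().lower() else "NO"


# llm_chain must end in something that yields a boolean; BooleanOutputParser
# turns the YES/NO string into True/False.
chain = prompt | RunnableLambda(pretend_llm) | BooleanOutputParser()

docs = [
    Document(page_content="Document compressors post-process retrieved results."),
    Document(page_content="A recipe for sourdough bread."),
]

doc_filter = LLMChainFilter(llm_chain=chain)
kept = doc_filter.compress_documents(docs, query="How do document compressors work?")
print([d.page_content for d in kept])  # only the compressor document should remain

In a real setup llm_chain is built from an actual language model with the default prompt from chain_filter_prompt, so the YES/NO parsing matches what the module expects.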
Domain
- CoreAbstractions
Subdomains
- RunnableInterface
Classes
- LLMChainFilter
Dependencies
- collections.abc
- langchain_classic.chains
- langchain_classic.output_parsers.boolean
- langchain_classic.retrievers.document_compressors.chain_filter_prompt
- langchain_core.callbacks
- langchain_core.documents
- langchain_core.language_models
- langchain_core.output_parsers
- langchain_core.prompts
- langchain_core.runnables
- langchain_core.runnables.config
- pydantic
- typing
Frequently Asked Questions
What does chain_filter.py do?
chain_filter.py is a Python source file in the langchain codebase. It implements LLMChainFilter, a document compressor that asks an LLM whether each retrieved document is relevant to the query and drops the ones that are not. It belongs to the CoreAbstractions domain, RunnableInterface subdomain.
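In a retrieval pipeline the class is normally wrapped in a ContextualCompressionRetriever so irrelevant hits are dropped before they reach the prompt. The sketch below is a hedged illustration: the ContextualCompressionRetriever import path under langchain_classic, the from_llm constructor (which sits in the elided part of the listing above), and the StaticRetriever helper are assumptions based on the upstream LangChain API, not on this file's visible source.

# Hedged sketch of the typical integration; the langchain_classic import path
# for ContextualCompressionRetriever and the from_llm constructor are assumed.
from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.language_models import FakeListLLM
from langchain_core.retrievers import BaseRetriever

from langchain_classic.retrievers import ContextualCompressionRetriever  # assumed path
from langchain_classic.retrievers.document_compressors.chain_filter import (
    LLMChainFilter,
)


class StaticRetriever(BaseRetriever):
    """Toy retriever that always returns the same documents."""

    docs: list[Document]

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> list[Document]:
        return self.docs


docs = [
    Document(page_content="Document compressors drop irrelevant retrieved results."),
    Document(page_content="A recipe for sourdough bread."),
]
llm = FakeListLLM(responses=["YES", "NO"])  # canned answers; use a real model in practice

retriever = ContextualCompressionRetriever(
    base_compressor=LLMChainFilter.from_llm(llm),  # from_llm assumed from LangChain docs
    base_retriever=StaticRetriever(docs=docs),
)
print(retriever.invoke("How do document compressors work?"))  # only docs judged relevant survive

The retriever first fetches candidates from base_retriever, then hands them to the compressor's compress_documents, which is why filtering here directly shrinks the context passed downstream.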
What functions are defined in chain_filter.py?
chain_filter.py defines 2 module-level functions, _get_default_chain_prompt and default_get_input. It also defines the LLMChainFilter class, whose compress_documents method performs the actual filtering.
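As a quick, hypothetical illustration of the second helper (the question and document text below are made up), default_get_input simply maps the query and document onto the prompt's two input variables:

from langchain_core.documents import Document
from langchain_classic.retrievers.document_compressors.chain_filter import (
    default_get_input,
)

doc = Document(page_content="Compressors trim retrieved context before it reaches the LLM.")
print(default_get_input("What do document compressors do?", doc))
# {'question': 'What do document compressors do?',
#  'context': 'Compressors trim retrieved context before it reaches the LLM.'}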
What does chain_filter.py depend on?
chain_filter.py imports 13 modules: collections.abc, langchain_classic.chains, langchain_classic.output_parsers.boolean, langchain_classic.retrievers.document_compressors.chain_filter_prompt, langchain_core.callbacks, langchain_core.documents, langchain_core.language_models, langchain_core.output_parsers, langchain_core.prompts, langchain_core.runnables, langchain_core.runnables.config, pydantic, and typing.
Where is chain_filter.py in the architecture?
chain_filter.py is located at libs/langchain/langchain_classic/retrievers/document_compressors/chain_filter.py (domain: CoreAbstractions, subdomain: RunnableInterface, directory: libs/langchain/langchain_classic/retrievers/document_compressors).