pyspark_dataframe.py — langchain Source File
Architecture documentation for pyspark_dataframe.py, a Python file in the langchain codebase. 3 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
    93f7630b_1137_9c8e_c9f5_4ce92df95f21["pyspark_dataframe.py"]
    8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
    93f7630b_1137_9c8e_c9f5_4ce92df95f21 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
    439a4142_6fa6_fe9a_2cba_7c9fb0cdceb7["langchain_classic._api"]
    93f7630b_1137_9c8e_c9f5_4ce92df95f21 --> 439a4142_6fa6_fe9a_2cba_7c9fb0cdceb7
    6f409e37_ea52_6777_4754_7d69799f7df3["langchain_community.document_loaders.pyspark_dataframe"]
    93f7630b_1137_9c8e_c9f5_4ce92df95f21 --> 6f409e37_ea52_6777_4754_7d69799f7df3
    style 93f7630b_1137_9c8e_c9f5_4ce92df95f21 fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
from typing import TYPE_CHECKING, Any

from langchain_classic._api import create_importer

if TYPE_CHECKING:
    from langchain_community.document_loaders.pyspark_dataframe import (
        PySparkDataFrameLoader,
    )

# Create a way to dynamically look up deprecated imports.
# Used to consolidate logic for raising deprecation warnings and
# handling optional imports.
DEPRECATED_LOOKUP = {
    "PySparkDataFrameLoader": "langchain_community.document_loaders.pyspark_dataframe",
}

_import_attribute = create_importer(__package__, deprecated_lookups=DEPRECATED_LOOKUP)


def __getattr__(name: str) -> Any:
    """Look up attributes dynamically."""
    return _import_attribute(name)


__all__ = ["PySparkDataFrameLoader"]
Domain
Subdomains
Functions
Dependencies
- langchain_classic._api
- langchain_community.document_loaders.pyspark_dataframe
- typing
Source
Frequently Asked Questions
What does pyspark_dataframe.py do?
pyspark_dataframe.py is a source file in the langchain codebase, written in Python. It belongs to the CoreAbstractions domain, Serialization subdomain.
What functions are defined in pyspark_dataframe.py?
pyspark_dataframe.py defines 1 function: __getattr__, a module-level PEP 562 hook that resolves deprecated imports dynamically via create_importer.
What does pyspark_dataframe.py depend on?
pyspark_dataframe.py imports 3 module(s): langchain_classic._api, langchain_community.document_loaders.pyspark_dataframe, typing.
Where is pyspark_dataframe.py in the architecture?
pyspark_dataframe.py is located at libs/langchain/langchain_classic/document_loaders/pyspark_dataframe.py (domain: CoreAbstractions, subdomain: Serialization, directory: libs/langchain/langchain_classic/document_loaders).