batch() — langchain Function Reference
Architecture documentation for the batch() function in base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    d6cc820f_2b37_3a5a_9f10_76a92f8566b6["batch()"]
    70048a7c_06ac_54dc_f2cc_70cbfa8bcbc9["_ConfigurableModel"]
    d6cc820f_2b37_3a5a_9f10_76a92f8566b6 -->|defined in| 70048a7c_06ac_54dc_f2cc_70cbfa8bcbc9
    1ec96341_0cad_d4df_0b5f_401dfd4ccd5a["_model()"]
    d6cc820f_2b37_3a5a_9f10_76a92f8566b6 -->|calls| 1ec96341_0cad_d4df_0b5f_401dfd4ccd5a
    style d6cc820f_2b37_3a5a_9f10_76a92f8566b6 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/langchain_classic/chat_models/base.py lines 803–829
def batch(
    self,
    inputs: list[LanguageModelInput],
    config: RunnableConfig | list[RunnableConfig] | None = None,
    *,
    return_exceptions: bool = False,
    **kwargs: Any | None,
) -> list[Any]:
    config = config or None
    # If <= 1 config use the underlying models batch implementation.
    if config is None or isinstance(config, dict) or len(config) <= 1:
        if isinstance(config, list):
            config = config[0]
        return self._model(config).batch(
            inputs,
            config=config,
            return_exceptions=return_exceptions,
            **kwargs,
        )
    # If multiple configs default to Runnable.batch which uses executor to invoke
    # in parallel.
    return super().batch(
        inputs,
        config=config,
        return_exceptions=return_exceptions,
        **kwargs,
    )
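Usage Sketch
A minimal, illustrative usage sketch (not taken from the source): it assumes init_chat_model from the langchain package, which returns a _ConfigurableModel when configurable_fields is set, and assumes API keys for the named providers are available; the model names and config values are examples only. With a single config, batch() delegates to the resolved model's own batch(); with one config per input, it falls back to Runnable.batch().
from langchain.chat_models import init_chat_model

# A fully configurable model: the concrete chat model is chosen per call
# from the "configurable" section of the RunnableConfig.
model = init_chat_model(configurable_fields=("model", "model_provider"))

inputs = ["Tell me a joke.", "Summarize the plot of Hamlet."]

# Single config: batch() resolves one underlying model via _model() and
# delegates to that model's own batch implementation.
# (Model names below are illustrative.)
single = model.batch(
    inputs,
    config={"configurable": {"model": "gpt-4o-mini", "model_provider": "openai"}},
)

# One config per input: batch() falls back to Runnable.batch(), which invokes
# each (input, config) pair in parallel on an executor.
per_input = model.batch(
    inputs,
    config=[
        {"configurable": {"model": "gpt-4o-mini", "model_provider": "openai"}},
        {"configurable": {"model": "claude-3-5-haiku-latest", "model_provider": "anthropic"}},
    ],
)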
Frequently Asked Questions
What does batch() do?
batch() is a method of _ConfigurableModel, defined in libs/langchain/langchain_classic/chat_models/base.py. Given a list of inputs, it dispatches on the config argument: with no config, a single config dict, or a one-element config list, it resolves the underlying chat model via _model() and delegates to that model's batch(); with multiple configs, it falls back to Runnable.batch(), which invokes the inputs in parallel on an executor.
Where is batch() defined?
batch() is defined in libs/langchain/langchain_classic/chat_models/base.py at line 803.
What does batch() call?
batch() directly calls one function, _model(), which resolves the given config into the underlying chat model whose batch implementation is then used.