abatch() — langchain Function Reference
Architecture documentation for the abatch() function in base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    1829ff07_b1a8_905c_d8f2_b6e368c51f36["abatch()"]
    74f697e6_2555_becc_640e_6f41ea707eba["_ConfigurableModel"]
    1829ff07_b1a8_905c_d8f2_b6e368c51f36 -->|defined in| 74f697e6_2555_becc_640e_6f41ea707eba
    4d79fd76_22aa_8160_64a3_c62bd0bfe875["_model()"]
    1829ff07_b1a8_905c_d8f2_b6e368c51f36 -->|calls| 4d79fd76_22aa_8160_64a3_c62bd0bfe875
    style 1829ff07_b1a8_905c_d8f2_b6e368c51f36 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain_v1/langchain/chat_models/base.py lines 775–801
async def abatch(
    self,
    inputs: list[LanguageModelInput],
    config: RunnableConfig | list[RunnableConfig] | None = None,
    *,
    return_exceptions: bool = False,
    **kwargs: Any | None,
) -> list[Any]:
    config = config or None
    # If <= 1 config use the underlying models batch implementation.
    if config is None or isinstance(config, dict) or len(config) <= 1:
        if isinstance(config, list):
            config = config[0]
        return await self._model(config).abatch(
            inputs,
            config=config,
            return_exceptions=return_exceptions,
            **kwargs,
        )
    # If multiple configs default to Runnable.batch which uses executor to invoke
    # in parallel.
    return await super().abatch(
        inputs,
        config=config,
        return_exceptions=return_exceptions,
        **kwargs,
    )
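Below is a minimal, hypothetical usage sketch, not part of the langchain source above. It assumes the langchain and langchain-openai packages are installed and an OpenAI API key is available in the environment; the model name and the configurable field are illustrative choices. init_chat_model() with configurable_fields returns a _ConfigurableModel, so the abatch() shown above is the method being exercised.

import asyncio

from langchain.chat_models import init_chat_model


async def main() -> None:
    # configurable_fields makes init_chat_model return a _ConfigurableModel.
    model = init_chat_model(
        "gpt-4o-mini",                      # illustrative model name
        model_provider="openai",
        configurable_fields=("temperature",),
    )

    prompts = ["Tell me a joke.", "Summarize the water cycle in one sentence."]

    # Zero or one config: abatch() resolves the model once via _model()
    # and delegates to that model's own abatch() implementation.
    same_config = await model.abatch(
        prompts,
        config={"configurable": {"temperature": 0.0}},
    )

    # One config per input: abatch() falls back to Runnable.abatch(),
    # which runs the differently configured invocations concurrently.
    per_input = await model.abatch(
        prompts,
        config=[
            {"configurable": {"temperature": 0.0}},
            {"configurable": {"temperature": 0.9}},
        ],
    )
    print(same_config)
    print(per_input)


asyncio.run(main())

Passing no config or a single config keeps the fast path that delegates to the resolved model, while a list of per-input configs takes the Runnable.abatch() fallback shown at the end of the source excerpt.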
Frequently Asked Questions
What does abatch() do?
abatch() is the asynchronous batch method of _ConfigurableModel in the langchain codebase, defined in libs/langchain_v1/langchain/chat_models/base.py. It runs a list of inputs through the configured chat model: with zero or one config it resolves the underlying model via _model() and delegates to that model's own abatch(); with a list of configs it falls back to the base Runnable.abatch(), which runs the per-config invocations in parallel.
Where is abatch() defined?
abatch() is defined in libs/langchain_v1/langchain/chat_models/base.py at line 775.
What does abatch() call?
abatch() directly calls one function: _model(), which resolves the underlying chat model for the supplied config. The batch work itself is then delegated to that model's abatch(), or to super().abatch() when multiple configs are provided.