abatch_as_completed() — langchain Function Reference

Architecture documentation for the abatch_as_completed() function in base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  5669c7f1_9bf5_fd6a_19ab_d6ebfffad7ff["abatch_as_completed()"]
  74f697e6_2555_becc_640e_6f41ea707eba["_ConfigurableModel"]
  5669c7f1_9bf5_fd6a_19ab_d6ebfffad7ff -->|defined in| 74f697e6_2555_becc_640e_6f41ea707eba
  4d79fd76_22aa_8160_64a3_c62bd0bfe875["_model()"]
  5669c7f1_9bf5_fd6a_19ab_d6ebfffad7ff -->|calls| 4d79fd76_22aa_8160_64a3_c62bd0bfe875
  style 5669c7f1_9bf5_fd6a_19ab_d6ebfffad7ff fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/langchain_v1/langchain/chat_models/base.py lines 832–863

    async def abatch_as_completed(
        self,
        inputs: Sequence[LanguageModelInput],
        config: RunnableConfig | Sequence[RunnableConfig] | None = None,
        *,
        return_exceptions: bool = False,
        **kwargs: Any,
    ) -> AsyncIterator[tuple[int, Any]]:
        config = config or None
        # If <= 1 config, use the underlying model's batch implementation.
        if config is None or isinstance(config, dict) or len(config) <= 1:
            if isinstance(config, list):
                config = config[0]
            async for x in self._model(
                cast("RunnableConfig", config),
            ).abatch_as_completed(  # type: ignore[call-overload]
                inputs,
                config=config,
                return_exceptions=return_exceptions,
                **kwargs,
            ):
                yield x
        # If multiple configs default to Runnable.batch which uses executor to invoke
        # in parallel.
        else:
            async for x in super().abatch_as_completed(  # type: ignore[call-overload]
                inputs,
                config=config,
                return_exceptions=return_exceptions,
                **kwargs,
            ):
                yield x
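
The core behavior — yielding each result as soon as it finishes, tagged with the index of its input — can be illustrated with a minimal, self-contained sketch. Everything here (`fake_model_call`, the latencies, the bare `abatch_as_completed` shown) is a hypothetical stand-in for the real langchain machinery, not actual library code:

```python
import asyncio
from typing import Any, AsyncIterator, Sequence


async def fake_model_call(text: str) -> str:
    # Hypothetical stand-in for a chat model call; latency scales with
    # input length so results complete out of input order.
    await asyncio.sleep(0.01 * len(text))
    return text.upper()


async def abatch_as_completed(
    inputs: Sequence[str],
) -> AsyncIterator[tuple[int, Any]]:
    # Wrap each input in a task tagged with its original index,
    # then yield (index, result) pairs as the tasks finish.
    async def run(i: int, text: str) -> tuple[int, str]:
        return i, await fake_model_call(text)

    tasks = [asyncio.ensure_future(run(i, t)) for i, t in enumerate(inputs)]
    for fut in asyncio.as_completed(tasks):
        yield await fut


async def main() -> list[tuple[int, Any]]:
    results = []
    async for pair in abatch_as_completed(["longest input", "hi", "medium"]):
        results.append(pair)
    return results


out = asyncio.run(main())
print(out)  # shortest input completes first: (1, 'HI') leads the stream
```

Because each yielded tuple carries the input's index, callers can re-associate results with inputs even though arrival order differs from submission order.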

Frequently Asked Questions

What does abatch_as_completed() do?
abatch_as_completed() runs a batch of inputs through the configured chat model asynchronously and yields (index, result) tuples as each call completes, rather than waiting for the whole batch. With zero or one config it delegates to the underlying model's own abatch_as_completed(); with multiple configs it falls back to the inherited Runnable implementation, which invokes the calls in parallel.
Where is abatch_as_completed() defined?
abatch_as_completed() is defined in libs/langchain_v1/langchain/chat_models/base.py at line 832.
What does abatch_as_completed() call?
abatch_as_completed() calls one function: _model(), which resolves the configured underlying chat model before delegating the batch to it.
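
The return_exceptions flag in the signature above changes how per-item failures surface. As a sketch of the general pattern (the function body and failing inputs below are hypothetical, not langchain code), a failed item can be yielded as its exception so the rest of the batch still completes:

```python
import asyncio
from typing import Any, AsyncIterator, Sequence


async def abatch_as_completed(
    inputs: Sequence[int],
    *,
    return_exceptions: bool = False,
) -> AsyncIterator[tuple[int, Any]]:
    # Hypothetical stand-in for a model call: fails on negative input.
    async def run(i: int, x: int) -> tuple[int, Any]:
        try:
            if x < 0:
                raise ValueError(f"bad input at index {i}")
            await asyncio.sleep(0.01)
            return i, x * 2
        except ValueError as exc:
            if return_exceptions:
                # Yield the exception as this item's result instead of
                # raising, so the other items still complete.
                return i, exc
            raise

    tasks = [asyncio.ensure_future(run(i, x)) for i, x in enumerate(inputs)]
    for fut in asyncio.as_completed(tasks):
        yield await fut


async def main() -> list[tuple[int, Any]]:
    out = []
    async for pair in abatch_as_completed([1, -2, 3], return_exceptions=True):
        out.append(pair)
    return out


results = sorted(asyncio.run(main()), key=lambda pair: pair[0])
print(results)  # index 1 carries a ValueError; indices 0 and 2 succeed
```

Without return_exceptions=True, the first failure would propagate out of the async iteration and abort consumption of the remaining results.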
