batch_as_completed() — langchain Function Reference

Architecture documentation for the batch_as_completed() function in base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  f644d7a4_934e_9a33_5129_33e23ab0574e["batch_as_completed()"]
  70048a7c_06ac_54dc_f2cc_70cbfa8bcbc9["_ConfigurableModel"]
  f644d7a4_934e_9a33_5129_33e23ab0574e -->|defined in| 70048a7c_06ac_54dc_f2cc_70cbfa8bcbc9
  1ec96341_0cad_d4df_0b5f_401dfd4ccd5a["_model()"]
  f644d7a4_934e_9a33_5129_33e23ab0574e -->|calls| 1ec96341_0cad_d4df_0b5f_401dfd4ccd5a
  style f644d7a4_934e_9a33_5129_33e23ab0574e fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/langchain/langchain_classic/chat_models/base.py lines 859–886

    def batch_as_completed(
        self,
        inputs: Sequence[LanguageModelInput],
        config: RunnableConfig | Sequence[RunnableConfig] | None = None,
        *,
        return_exceptions: bool = False,
        **kwargs: Any,
    ) -> Iterator[tuple[int, Any | Exception]]:
        config = config or None
        # If <= 1 config, use the underlying model's batch_as_completed implementation.
        if config is None or isinstance(config, dict) or len(config) <= 1:
            if isinstance(config, list):
                config = config[0]
            yield from self._model(cast("RunnableConfig", config)).batch_as_completed(  # type: ignore[call-overload]
                inputs,
                config=config,
                return_exceptions=return_exceptions,
                **kwargs,
            )
        # If multiple configs default to Runnable.batch which uses executor to invoke
        # in parallel.
        else:
            yield from super().batch_as_completed(  # type: ignore[call-overload]
                inputs,
                config=config,
                return_exceptions=return_exceptions,
                **kwargs,
            )


Frequently Asked Questions

What does batch_as_completed() do?
batch_as_completed() runs a batch of inputs through the chat model and yields each (index, result) pair as it completes. With zero or one config it delegates to the underlying model's batch_as_completed(); with multiple configs it falls back to Runnable.batch_as_completed(), which uses an executor to invoke the calls in parallel.
Where is batch_as_completed() defined?
batch_as_completed() is defined in libs/langchain/langchain_classic/chat_models/base.py at line 859.
What does batch_as_completed() call?
batch_as_completed() calls one function: _model(), which resolves the underlying chat model for the given config.
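
A consequence of the parallel branch worth illustrating: results arrive in completion order, not input order, which is why each result carries its input index. A stdlib-only demonstration (not langchain code; `slow_double` is a stand-in for a model call):

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def slow_double(x: int) -> int:
    # The first input is deliberately the slowest, so it finishes last.
    time.sleep(0.2 if x == 0 else 0.01)
    return x * 2

with ThreadPoolExecutor() as pool:
    futures = {pool.submit(slow_double, i): i for i in range(3)}
    order = [futures[f] for f in as_completed(futures)]

# All three indices appear, with the slow input (index 0) last.
```

Consumers that need input order must sort or bucket by the yielded index themselves.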
