batch_as_completed() — langchain Function Reference

Architecture documentation for the batch_as_completed() function in base.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  2ecc44f6_41ff_4e91_2b6c_cef50b4c357c["batch_as_completed()"]
  74f697e6_2555_becc_640e_6f41ea707eba["_ConfigurableModel"]
  2ecc44f6_41ff_4e91_2b6c_cef50b4c357c -->|defined in| 74f697e6_2555_becc_640e_6f41ea707eba
  4d79fd76_22aa_8160_64a3_c62bd0bfe875["_model()"]
  2ecc44f6_41ff_4e91_2b6c_cef50b4c357c -->|calls| 4d79fd76_22aa_8160_64a3_c62bd0bfe875
  style 2ecc44f6_41ff_4e91_2b6c_cef50b4c357c fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/langchain_v1/langchain/chat_models/base.py lines 803–830

    def batch_as_completed(
        self,
        inputs: Sequence[LanguageModelInput],
        config: RunnableConfig | Sequence[RunnableConfig] | None = None,
        *,
        return_exceptions: bool = False,
        **kwargs: Any,
    ) -> Iterator[tuple[int, Any | Exception]]:
        config = config or None
        # If <= 1 config, use the underlying model's batch implementation.
        if config is None or isinstance(config, dict) or len(config) <= 1:
            if isinstance(config, list):
                config = config[0]
            yield from self._model(cast("RunnableConfig", config)).batch_as_completed(  # type: ignore[call-overload]
                inputs,
                config=config,
                return_exceptions=return_exceptions,
                **kwargs,
            )
        # If multiple configs default to Runnable.batch which uses executor to invoke
        # in parallel.
        else:
            yield from super().batch_as_completed(  # type: ignore[call-overload]
                inputs,
                config=config,
                return_exceptions=return_exceptions,
                **kwargs,
            )
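The key behavior here is "as completed" ordering: results are yielded in completion order, not input order, each paired with the index of the input that produced it. The following is a minimal sketch of those semantics using the standard library; `batch_as_completed_sketch` and `fn` are illustrative names, not part of langchain's API.

```python
from collections.abc import Callable, Iterator, Sequence
from concurrent.futures import ThreadPoolExecutor, as_completed
from typing import Any


def batch_as_completed_sketch(
    fn: Callable[[Any], Any],
    inputs: Sequence[Any],
    *,
    return_exceptions: bool = False,
) -> Iterator[tuple[int, Any]]:
    """Yield (index, result) pairs as each input finishes, in completion order."""
    with ThreadPoolExecutor() as executor:
        # Map each future back to the index of the input it was submitted for.
        futures = {executor.submit(fn, inp): i for i, inp in enumerate(inputs)}
        for future in as_completed(futures):
            i = futures[future]
            try:
                yield i, future.result()
            except Exception as e:
                if return_exceptions:
                    # Yield the exception in place of a result, like langchain does.
                    yield i, e
                else:
                    raise
```

With `return_exceptions=True`, a failing input yields its exception object instead of aborting the whole batch, which mirrors the `return_exceptions` flag in the source above.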

Frequently Asked Questions

What does batch_as_completed() do?
batch_as_completed() runs a batch of inputs through the configurable chat model and yields (index, result) tuples as each invocation completes, rather than in input order. With zero or one config it delegates to the underlying model's batch_as_completed(); with multiple configs it falls back to Runnable.batch_as_completed(), which invokes the inputs in parallel via an executor.
Where is batch_as_completed() defined?
batch_as_completed() is defined in libs/langchain_v1/langchain/chat_models/base.py at line 803.
What does batch_as_completed() call?
batch_as_completed() calls one function: _model(), which resolves the currently configured underlying chat model.
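The dispatch decision in the source can be isolated as a small helper. This is a sketch for illustration only; `pick_strategy` is a hypothetical name, not a langchain function.

```python
def pick_strategy(config):
    """Mirror the dispatch in batch_as_completed():

    - zero or one config -> delegate to the underlying model's implementation
      (unwrapping a single-element list to a plain config)
    - multiple configs -> fall back to the superclass, which runs
      invocations in parallel via an executor
    """
    config = config or None  # empty sequences normalize to None
    if config is None or isinstance(config, dict) or len(config) <= 1:
        if isinstance(config, list):
            config = config[0]
        return ("delegate", config)
    return ("superclass", config)
```

Note that a single `dict` config and a one-element list of configs take the same fast path, so callers do not pay the executor overhead for the common single-config case.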
