
huggingface_pipeline.py — langchain Source File

Architecture documentation for huggingface_pipeline.py, a Python file in the langchain codebase. 13 imports, 0 dependents.

Entity Profile

Dependency Diagram

graph LR
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944["huggingface_pipeline.py"]
  01d44b95_02da_6b59_b6ed_842bc82dfa56["importlib.util"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 01d44b95_02da_6b59_b6ed_842bc82dfa56
  2a7f66a7_8738_3d47_375b_70fcaa6ac169["logging"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 2a7f66a7_8738_3d47_375b_70fcaa6ac169
  cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7["collections.abc"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> cfe2bde5_180e_e3b0_df2b_55b3ebaca8e7
  8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3["typing"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 8e2034b7_ceb8_963f_29fc_2ea6b50ef9b3
  f3bc7443_c889_119d_0744_aacc3620d8d2["langchain_core.callbacks"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> f3bc7443_c889_119d_0744_aacc3620d8d2
  89934eed_a823_2184_acf2_039f48eed5f9["langchain_core.language_models.llms"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 89934eed_a823_2184_acf2_039f48eed5f9
  ac2a9b92_4484_491e_1b48_ec85e71e1d58["langchain_core.outputs"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> ac2a9b92_4484_491e_1b48_ec85e71e1d58
  6e58aaea_f08e_c099_3cc7_f9567bfb1ae7["pydantic"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 6e58aaea_f08e_c099_3cc7_f9567bfb1ae7
  a1e869ce_7e7c_ae34_1689_05e9426d927c["langchain_huggingface.utils.import_utils"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> a1e869ce_7e7c_ae34_1689_05e9426d927c
  5fa57f2e_a369_d46f_9683_07052dea2f26["transformers"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 5fa57f2e_a369_d46f_9683_07052dea2f26
  a4ac84d5_8fa2_c8e1_3839_44601e812c97["optimum.intel"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> a4ac84d5_8fa2_c8e1_3839_44601e812c97
  5b16111a_6dc1_bd5c_28fb_a811c40bca6d["torch"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 5b16111a_6dc1_bd5c_28fb_a811c40bca6d
  242d0b7d_a8ef_b66d_169b_c791b32a9cc9["threading"]
  e25ba0f1_eecf_fa24_8dd1_0d43ca217944 --> 242d0b7d_a8ef_b66d_169b_c791b32a9cc9
  style e25ba0f1_eecf_fa24_8dd1_0d43ca217944 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

from __future__ import annotations  # type: ignore[import-not-found]

import importlib.util
import logging
from collections.abc import Iterator, Mapping
from typing import Any

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.llms import BaseLLM
from langchain_core.outputs import Generation, GenerationChunk, LLMResult
from pydantic import ConfigDict, model_validator

from langchain_huggingface.utils.import_utils import (
    IMPORT_ERROR,
    is_ipex_available,
    is_openvino_available,
    is_optimum_intel_available,
    is_optimum_intel_version,
)

DEFAULT_MODEL_ID = "gpt2"
DEFAULT_TASK = "text-generation"
VALID_TASKS = (
    "text2text-generation",
    "text-generation",
    "image-text-to-text",
    "summarization",
    "translation",
)
DEFAULT_BATCH_SIZE = 4
_MIN_OPTIMUM_VERSION = "1.21"


logger = logging.getLogger(__name__)


class HuggingFacePipeline(BaseLLM):
    """HuggingFace Pipeline API.

    To use, you should have the `transformers` python package installed.

    Only supports `text-generation`, `text2text-generation`, `image-text-to-text`,
    `summarization` and `translation` for now.

    Example using from_model_id:
        ```python
        from langchain_huggingface import HuggingFacePipeline

        hf = HuggingFacePipeline.from_model_id(
            model_id="gpt2",
            task="text-generation",
            pipeline_kwargs={"max_new_tokens": 10},
        )
        ```

    Example passing pipeline in directly:
        ```python
        from langchain_huggingface import HuggingFacePipeline
        from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

// ... (363 more lines)
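
The second docstring example ("passing pipeline in directly") is truncated in the listing above. As a rough completion of that pattern, the sketch below assumes HuggingFacePipeline accepts a prebuilt transformers pipeline via a `pipeline=` keyword and, as a `BaseLLM` subclass, exposes the standard LangChain Runnable `invoke` method; treat it as illustrative rather than the file's verbatim docstring.

```python
from langchain_huggingface import HuggingFacePipeline
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a plain transformers text-generation pipeline, then hand it to the
# LangChain wrapper. The `pipeline=` keyword is assumed from the truncated
# docstring example above.
pipe = pipeline(
    "text-generation", model=model, tokenizer=tokenizer, max_new_tokens=10
)
hf = HuggingFacePipeline(pipeline=pipe)

# HuggingFacePipeline subclasses BaseLLM, so the usual Runnable interface
# applies; `invoke` returns the generated string.
print(hf.invoke("Once upon a time"))
```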

Subdomains

  • RunnableInterface

Dependencies

  • collections.abc
  • importlib.util
  • langchain_core.callbacks
  • langchain_core.language_models.llms
  • langchain_core.outputs
  • langchain_huggingface.utils.import_utils
  • logging
  • optimum.intel
  • pydantic
  • threading
  • torch
  • transformers
  • typing
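
Several of these dependencies (transformers, optimum.intel, torch) are optional backends, and the presence of importlib.util alongside helpers such as is_optimum_intel_available and IMPORT_ERROR suggests they are detected lazily rather than imported unconditionally. The following is a minimal sketch of that common guard pattern, using only importlib.util; it is illustrative and not the file's actual logic.

```python
import importlib.util


def _is_package_available(name: str) -> bool:
    # find_spec reports whether a module can be imported without importing it.
    return importlib.util.find_spec(name) is not None


# Hypothetical guard: fail early with an actionable message when an optional
# backend is missing, in the spirit of the IMPORT_ERROR-style checks imported
# from langchain_huggingface.utils.import_utils.
if not _is_package_available("transformers"):
    msg = "Could not import transformers. Please install it with `pip install transformers`."
    raise ImportError(msg)
```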

Frequently Asked Questions

What does huggingface_pipeline.py do?
huggingface_pipeline.py is a Python source file in the langchain codebase. It belongs to the CoreAbstractions domain, RunnableInterface subdomain.
What does huggingface_pipeline.py depend on?
huggingface_pipeline.py imports 13 modules: collections.abc, importlib.util, langchain_core.callbacks, langchain_core.language_models.llms, langchain_core.outputs, langchain_huggingface.utils.import_utils, logging, optimum.intel, and 5 more.
Where is huggingface_pipeline.py in the architecture?
huggingface_pipeline.py is located at libs/partners/huggingface/langchain_huggingface/llms/huggingface_pipeline.py (domain: CoreAbstractions, subdomain: RunnableInterface, directory: libs/partners/huggingface/langchain_huggingface/llms).
