
llms.py — langchain Source File

Architecture documentation for llms.py, a Python source file in the langchain codebase (9 imports, 0 dependents).

Entity Profile

Dependency Diagram

graph LR
  llms_py["llms.py"]
  collections_abc["collections.abc"]
  llms_py --> collections_abc
  typing["typing"]
  llms_py --> typing
  langchain_core_callbacks["langchain_core.callbacks"]
  llms_py --> langchain_core_callbacks
  langchain_core_language_models["langchain_core.language_models"]
  llms_py --> langchain_core_language_models
  langchain_core_outputs["langchain_core.outputs"]
  llms_py --> langchain_core_outputs
  ollama["ollama"]
  llms_py --> ollama
  pydantic["pydantic"]
  llms_py --> pydantic
  typing_extensions["typing_extensions"]
  llms_py --> typing_extensions
  langchain_ollama_utils["langchain_ollama._utils"]
  llms_py --> langchain_ollama_utils
  style llms_py fill:#6366f1,stroke:#818cf8,color:#fff


Source Code

"""Ollama large language models."""

from __future__ import annotations

from collections.abc import AsyncIterator, Iterator, Mapping
from typing import Any, Literal

from langchain_core.callbacks import (
    AsyncCallbackManagerForLLMRun,
    CallbackManagerForLLMRun,
)
from langchain_core.language_models import BaseLLM, LangSmithParams
from langchain_core.outputs import GenerationChunk, LLMResult
from ollama import AsyncClient, Client, Options
from pydantic import PrivateAttr, model_validator
from typing_extensions import Self

from langchain_ollama._utils import (
    merge_auth_headers,
    parse_url_with_auth,
    validate_model,
)


class OllamaLLM(BaseLLM):
    """Ollama large language models.

    Setup:
        Install `langchain-ollama` and install/run the Ollama server locally:

        ```bash
        pip install -U langchain-ollama
        # Visit https://ollama.com/download to download and install Ollama
        # (Linux users): start the server with `ollama serve`
        ```

        Download a model to use:

        ```bash
        ollama pull llama3.1
        ```

    Key init args — generation params:
        model: str
            Name of the Ollama model to use (e.g. `'llama4'`).
        temperature: float | None
            Sampling temperature. Higher values make output more creative.
        num_predict: int | None
            Maximum number of tokens to predict.
        top_k: int | None
            Limits the next token selection to the K most probable tokens.
        top_p: float | None
            Nucleus sampling parameter. Higher values lead to more diverse text.
        mirostat: int | None
            Enable Mirostat sampling for controlling perplexity.
        seed: int | None
            Random number seed for generation reproducibility.

    Key init args — client params:
        base_url:
# ... (490 more lines)
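The generation params documented above map directly onto the constructor. A minimal usage sketch, assuming a local Ollama server is running with `llama3.1` pulled as in the setup block (the prompt strings and parameter values are illustrative):

```python
from langchain_ollama import OllamaLLM

# All generation params are optional except `model`.
llm = OllamaLLM(
    model="llama3.1",
    temperature=0.7,   # higher = more creative output
    num_predict=256,   # cap on tokens to predict
    seed=42,           # fixed seed for reproducible sampling
)

# BaseLLM implements the Runnable interface, so `invoke` returns a string
# and `stream` yields incremental string chunks.
print(llm.invoke("Why is the sky blue?"))

for chunk in llm.stream("Tell me a short story."):
    print(chunk, end="", flush=True)
```

Async variants (`ainvoke`, `astream`) are available through the same interface, which is why the file imports `AsyncClient` and `AsyncCallbackManagerForLLMRun`.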

Subdomains

  • RunnableInterface

Classes

  • OllamaLLM

Dependencies

  • collections.abc
  • langchain_core.callbacks
  • langchain_core.language_models
  • langchain_core.outputs
  • langchain_ollama._utils
  • ollama
  • pydantic
  • typing
  • typing_extensions

Frequently Asked Questions

What does llms.py do?
llms.py is a source file in the langchain codebase, written in Python. It belongs to the CoreAbstractions domain, RunnableInterface subdomain, and defines OllamaLLM, a BaseLLM subclass that wraps the Ollama client for local large language models.
What does llms.py depend on?
llms.py imports nine modules: collections.abc, langchain_core.callbacks, langchain_core.language_models, langchain_core.outputs, langchain_ollama._utils, ollama, pydantic, typing, and typing_extensions.
Where is llms.py in the architecture?
llms.py is located at libs/partners/ollama/langchain_ollama/llms.py (domain: CoreAbstractions, subdomain: RunnableInterface, directory: libs/partners/ollama/langchain_ollama).
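The file path maps directly onto the module path. A short sketch of the correspondence:

```python
# The directory libs/partners/ollama is the package root, so the file
# langchain_ollama/llms.py is importable as langchain_ollama.llms.
from langchain_core.language_models import BaseLLM
from langchain_ollama.llms import OllamaLLM

assert issubclass(OllamaLLM, BaseLLM)  # matches the class definition above
```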
