test_llms.py — langchain Source File

Architecture documentation for test_llms.py, a Python source file in the langchain codebase with 5 imports and 0 dependents.

File · Python · CoreAbstractions / MessageSchema · 5 imports · 12 functions

Entity Profile

Dependency Diagram

graph LR
  8d3b9273_6b9f_94a1_7c83_b0a38345a7ae["test_llms.py"]
  9e98f0a7_ec6e_708f_4f1b_e9428b316e1c["os"]
  8d3b9273_6b9f_94a1_7c83_b0a38345a7ae --> 9e98f0a7_ec6e_708f_4f1b_e9428b316e1c
  120e2591_3e15_b895_72b6_cb26195e40a6["pytest"]
  8d3b9273_6b9f_94a1_7c83_b0a38345a7ae --> 120e2591_3e15_b895_72b6_cb26195e40a6
  ac2a9b92_4484_491e_1b48_ec85e71e1d58["langchain_core.outputs"]
  8d3b9273_6b9f_94a1_7c83_b0a38345a7ae --> ac2a9b92_4484_491e_1b48_ec85e71e1d58
  2ceb1686_0f8c_8ae0_36d1_7c0b702fda1c["langchain_core.runnables"]
  8d3b9273_6b9f_94a1_7c83_b0a38345a7ae --> 2ceb1686_0f8c_8ae0_36d1_7c0b702fda1c
  b44d4491_221c_f696_01cc_f4c388dae365["langchain_ollama.llms"]
  8d3b9273_6b9f_94a1_7c83_b0a38345a7ae --> b44d4491_221c_f696_01cc_f4c388dae365
  style 8d3b9273_6b9f_94a1_7c83_b0a38345a7ae fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Test OllamaLLM llm."""

import os

import pytest
from langchain_core.outputs import GenerationChunk
from langchain_core.runnables import RunnableConfig

from langchain_ollama.llms import OllamaLLM

MODEL_NAME = os.environ.get("OLLAMA_TEST_MODEL", "llama3.1")
REASONING_MODEL_NAME = os.environ.get("OLLAMA_REASONING_TEST_MODEL", "deepseek-r1:1.5b")
SAMPLE = "What is 3^3?"


def test_invoke() -> None:
    """Test sync invoke returning a string."""
    llm = OllamaLLM(model=MODEL_NAME)
    result = llm.invoke("I'm Pickle Rick", config=RunnableConfig(tags=["foo"]))
    assert isinstance(result, str)


async def test_ainvoke() -> None:
    """Test async invoke returning a string."""
    llm = OllamaLLM(model=MODEL_NAME)

    result = await llm.ainvoke("I'm Pickle Rick", config=RunnableConfig(tags=["foo"]))
    assert isinstance(result, str)


def test_batch() -> None:
    """Test batch sync token generation from `OllamaLLM`."""
    llm = OllamaLLM(model=MODEL_NAME)

    result = llm.batch(["I'm Pickle Rick", "I'm not Pickle Rick"])
    for token in result:
        assert isinstance(token, str)


async def test_abatch() -> None:
    """Test batch async token generation from `OllamaLLM`."""
    llm = OllamaLLM(model=MODEL_NAME)

    result = await llm.abatch(["I'm Pickle Rick", "I'm not Pickle Rick"])
    for token in result:
        assert isinstance(token, str)


def test_batch_tags() -> None:
    """Test batch sync token generation with tags."""
    llm = OllamaLLM(model=MODEL_NAME)

    result = llm.batch(
        ["I'm Pickle Rick", "I'm not Pickle Rick"], config={"tags": ["foo"]}
    )
    for token in result:
        assert isinstance(token, str)


async def test_abatch_tags() -> None:
# ... (121 more lines)
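
The elided functions follow the same arrange-and-assert pattern as those shown above. As an illustrative sketch only (not the file's actual code), a token-streaming test in the style of test_astream_text_tokens might look like the following, reusing MODEL_NAME and SAMPLE from the top of the file:

async def test_astream_text_tokens() -> None:
    """Sketch: async streaming from OllamaLLM should yield string tokens."""
    llm = OllamaLLM(model=MODEL_NAME)
    async for token in llm.astream(SAMPLE):
        assert isinstance(token, str)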

Dependencies

  • langchain_core.outputs
  • langchain_core.runnables
  • langchain_ollama.llms
  • os
  • pytest

Frequently Asked Questions

What does test_llms.py do?
test_llms.py is an integration-test file in the langchain codebase, written in Python. It exercises the OllamaLLM class from langchain_ollama (sync and async invoke, batch, and streaming) and belongs to the CoreAbstractions domain, MessageSchema subdomain.
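
Because these are integration tests, running them assumes a locally reachable Ollama server with the referenced models pulled (an inference from the imports and model-name constants, not something stated on this page). A minimal sketch for running just this file programmatically:

import os

import pytest

# Optionally point the tests at locally available models
# (the defaults shown in the source above).
os.environ.setdefault("OLLAMA_TEST_MODEL", "llama3.1")
os.environ.setdefault("OLLAMA_REASONING_TEST_MODEL", "deepseek-r1:1.5b")

# Run only this file; assumes an Ollama server is reachable at its default address.
pytest.main(["libs/partners/ollama/tests/integration_tests/test_llms.py", "-v"])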
What functions are defined in test_llms.py?
test_llms.py defines 12 functions: test__astream_no_reasoning, test__astream_with_reasoning, test__stream_no_reasoning, test__stream_with_reasoning, test_abatch, test_abatch_tags, test_ainvoke, test_astream_text_tokens, test_batch, test_batch_tags, and 2 more.
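
The test__stream_with_reasoning and test__astream_with_reasoning names, together with the REASONING_MODEL_NAME constant, suggest the file also exercises reasoning output through the private _stream/_astream methods, which yield GenerationChunk objects. A hedged sketch of that pattern, reusing the module constants (the reasoning flag is inferred from the test names and is not visible in the excerpt above):

def test__stream_with_reasoning() -> None:
    """Sketch: streaming a reasoning model yields GenerationChunk objects."""
    llm = OllamaLLM(model=REASONING_MODEL_NAME, reasoning=True)  # reasoning flag is an assumption
    for chunk in llm._stream(SAMPLE):
        assert isinstance(chunk, GenerationChunk)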
What does test_llms.py depend on?
test_llms.py imports 5 modules: langchain_core.outputs, langchain_core.runnables, langchain_ollama.llms, os, and pytest.
Where is test_llms.py in the architecture?
test_llms.py is located at libs/partners/ollama/tests/integration_tests/test_llms.py (domain: CoreAbstractions, subdomain: MessageSchema, directory: libs/partners/ollama/tests/integration_tests).
