test_llms.py — langchain Source File

Architecture documentation for test_llms.py, a Python file in the langchain codebase. 2 imports, 0 dependents.

Entity Profile

Dependency Diagram

graph LR
  310ed540_67b4_ad96_69a0_dc5e9ffc5c34["test_llms.py"]
  120e2591_3e15_b895_72b6_cb26195e40a6["pytest"]
  310ed540_67b4_ad96_69a0_dc5e9ffc5c34 --> 120e2591_3e15_b895_72b6_cb26195e40a6
  9d19aeb9_35aa_ec9e_adca_9ac1aea84236["langchain_fireworks"]
  310ed540_67b4_ad96_69a0_dc5e9ffc5c34 --> 9d19aeb9_35aa_ec9e_adca_9ac1aea84236
  style 310ed540_67b4_ad96_69a0_dc5e9ffc5c34 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

"""Test Fireworks API wrapper.

To run this test, you need a Fireworks API key.

You can get one by registering for free at https://api.fireworks.ai/.

A test key can be found at https://api.fireworks.ai/settings/api-keys.

You'll then need to set the `FIREWORKS_API_KEY` environment variable to your API key.
"""

import pytest

from langchain_fireworks import Fireworks

_MODEL = "accounts/fireworks/models/deepseek-v3p1"


def test_fireworks_call() -> None:
    """Test simple call to Fireworks."""
    llm = Fireworks(
        model=_MODEL,
        temperature=0.2,
        max_tokens=250,
    )
    output = llm.invoke("Say foo:")

    assert llm._llm_type == "fireworks"
    assert isinstance(output, str)
    assert len(output) > 0


async def test_fireworks_acall() -> None:
    """Test simple async call to Fireworks."""
    llm = Fireworks(
        model=_MODEL,
        temperature=0.2,
        max_tokens=250,
    )
    output = await llm.agenerate(["Say foo:"], stop=["bar"])

    assert llm._llm_type == "fireworks"
    output_text = output.generations[0][0].text
    assert isinstance(output_text, str)
    assert output_text.count("bar") <= 1


def test_stream() -> None:
    """Test streaming tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    for token in llm.stream("I'm Pickle Rick"):
        assert isinstance(token, str)


async def test_astream() -> None:
    """Test async streaming tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    async for token in llm.astream("I'm Pickle Rick"):
        assert isinstance(token, str)


async def test_abatch() -> None:
    """Test batch tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    result = await llm.abatch(["I'm Pickle Rick", "I'm not Pickle Rick"])
    for token in result:
        assert isinstance(token, str)


async def test_abatch_tags() -> None:
    """Test batch tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    result = await llm.abatch(
        ["I'm Pickle Rick", "I'm not Pickle Rick"], config={"tags": ["foo"]}
    )
    for token in result:
        assert isinstance(token, str)


def test_batch() -> None:
    """Test batch tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    result = llm.batch(["I'm Pickle Rick", "I'm not Pickle Rick"])
    for token in result:
        assert isinstance(token, str)


async def test_ainvoke() -> None:
    """Test invoke tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    result = await llm.ainvoke("I'm Pickle Rick", config={"tags": ["foo"]})
    assert isinstance(result, str)


def test_invoke() -> None:
    """Test invoke tokens from Fireworks."""
    llm = Fireworks(model=_MODEL)

    result = llm.invoke("I'm Pickle Rick", config={"tags": ["foo"]})
    assert isinstance(result, str)
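Because every test above makes a live call to the Fireworks API, running them without `FIREWORKS_API_KEY` set will fail at the network layer rather than skip cleanly. A common pattern for guarding such integration tests (not part of this file; the marker name below is hypothetical) is a `pytest.mark.skipif` tied to the environment variable:

```python
import os

import pytest

# Hypothetical guard marker: skips a test when no Fireworks key is configured.
requires_fireworks_key = pytest.mark.skipif(
    os.environ.get("FIREWORKS_API_KEY") is None,
    reason="FIREWORKS_API_KEY is not set; skipping Fireworks integration test",
)


@requires_fireworks_key
def test_fireworks_smoke() -> None:
    """Placeholder showing where a guarded live-API test would go."""
```

Applied this way, the suite stays green on machines without credentials while still exercising the real API when the key is present.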

Dependencies

  • langchain_fireworks
  • pytest

Frequently Asked Questions

What does test_llms.py do?
test_llms.py is a Python source file in the langchain codebase. It belongs to the CoreAbstractions domain, RunnableInterface subdomain.
What functions are defined in test_llms.py?
test_llms.py defines 9 function(s): test_abatch, test_abatch_tags, test_ainvoke, test_astream, test_batch, test_fireworks_acall, test_fireworks_call, test_invoke, test_stream.
What does test_llms.py depend on?
test_llms.py imports 2 module(s): langchain_fireworks, pytest.
Where is test_llms.py in the architecture?
test_llms.py is located at libs/partners/fireworks/tests/integration_tests/test_llms.py (domain: CoreAbstractions, subdomain: RunnableInterface, directory: libs/partners/fireworks/tests/integration_tests).
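The tests in this file exercise the standard Runnable-style entry points (`invoke`, `ainvoke`, `batch`, `abatch`, `stream`, `astream`). A toy stand-in class, not the real `Fireworks` LLM, can illustrate how those call patterns relate to one another; everything below (class name, echo behavior) is illustrative only:

```python
import asyncio
from typing import Iterator


class EchoLLM:
    """Toy stand-in for an LLM exposing the call patterns tested above."""

    def invoke(self, prompt: str) -> str:
        # The core sync entry point: one prompt in, one string out.
        return f"echo: {prompt}"

    def batch(self, prompts: list[str]) -> list[str]:
        # batch is invoke applied to each prompt in turn.
        return [self.invoke(p) for p in prompts]

    def stream(self, prompt: str) -> Iterator[str]:
        # stream yields the output piece by piece instead of all at once.
        yield from self.invoke(prompt).split()

    async def ainvoke(self, prompt: str) -> str:
        # Async variant; here it simply delegates to the sync implementation.
        return self.invoke(prompt)

    async def abatch(self, prompts: list[str]) -> list[str]:
        return [await self.ainvoke(p) for p in prompts]


llm = EchoLLM()
result = llm.invoke("Say foo:")
batched = llm.batch(["a", "b"])
tokens = list(llm.stream("Say foo:"))
async_result = asyncio.run(llm.ainvoke("hi"))
```

The assertions in test_llms.py check exactly these shapes: `invoke`/`ainvoke` return a single `str`, `batch`/`abatch` return a list of `str`, and `stream`/`astream` yield `str` chunks.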
