test_combine_llm_outputs_with_missing_details() — langchain Function Reference

Architecture documentation for the test_combine_llm_outputs_with_missing_details() function in test_chat_models.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  3954f029_5420_d202_8e6f_31b1cf639834["test_combine_llm_outputs_with_missing_details()"]
  5bf2e477_37e0_3e98_4042_bc609f2f7f60["test_chat_models.py"]
  3954f029_5420_d202_8e6f_31b1cf639834 -->|defined in| 5bf2e477_37e0_3e98_4042_bc609f2f7f60
  style 3954f029_5420_d202_8e6f_31b1cf639834 fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/partners/groq/tests/unit_tests/test_chat_models.py lines 913–943

def test_combine_llm_outputs_with_missing_details() -> None:
    """Test _combine_llm_outputs when some outputs have details and others don't."""
    llm = ChatGroq(model="test-model")

    llm_outputs: list[dict[str, Any] | None] = [
        {
            "token_usage": {
                "prompt_tokens": 100,
                "completion_tokens": 50,
                "total_tokens": 150,
            },
            "model_name": "test-model",
        },
        {
            "token_usage": {
                "prompt_tokens": 200,
                "completion_tokens": 100,
                "total_tokens": 300,
                "output_tokens_details": {"reasoning_tokens": 40},
            },
            "model_name": "test-model",
        },
    ]

    result = llm._combine_llm_outputs(llm_outputs)

    assert result["token_usage"]["prompt_tokens"] == 300
    assert result["token_usage"]["completion_tokens"] == 150
    assert result["token_usage"]["total_tokens"] == 450
    assert result["token_usage"]["output_tokens_details"]["reasoning_tokens"] == 40
    assert "input_tokens_details" not in result["token_usage"]

Frequently Asked Questions

What does test_combine_llm_outputs_with_missing_details() do?
test_combine_llm_outputs_with_missing_details() is a unit test in the langchain codebase. It verifies that ChatGroq._combine_llm_outputs() sums token counts across multiple LLM outputs and preserves output_tokens_details from the outputs that include it, without introducing an input_tokens_details entry that none of the outputs provided.
Where is test_combine_llm_outputs_with_missing_details() defined?
test_combine_llm_outputs_with_missing_details() is defined in libs/partners/groq/tests/unit_tests/test_chat_models.py at line 913.
