test_combine_llm_outputs_with_token_details() — langchain Function Reference
Architecture documentation for the test_combine_llm_outputs_with_token_details() function in test_chat_models.py from the langchain codebase.
Dependency Diagram
graph TD
    4d9a6b7d_b488_60b2_7451_f6ec7dca24f8["test_combine_llm_outputs_with_token_details()"]
    5bf2e477_37e0_3e98_4042_bc609f2f7f60["test_chat_models.py"]
    4d9a6b7d_b488_60b2_7451_f6ec7dca24f8 -->|defined in| 5bf2e477_37e0_3e98_4042_bc609f2f7f60
    style 4d9a6b7d_b488_60b2_7451_f6ec7dca24f8 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/groq/tests/unit_tests/test_chat_models.py lines 873–910
def test_combine_llm_outputs_with_token_details() -> None:
    """Test that _combine_llm_outputs properly combines nested token details."""
    llm = ChatGroq(model="test-model")
    llm_outputs: list[dict[str, Any] | None] = [
        {
            "token_usage": {
                "prompt_tokens": 100,
                "completion_tokens": 50,
                "total_tokens": 150,
                "input_tokens_details": {"cached_tokens": 80},
                "output_tokens_details": {"reasoning_tokens": 20},
            },
            "model_name": "test-model",
            "system_fingerprint": "fp_123",
        },
        {
            "token_usage": {
                "prompt_tokens": 200,
                "completion_tokens": 100,
                "total_tokens": 300,
                "input_tokens_details": {"cached_tokens": 150},
                "output_tokens_details": {"reasoning_tokens": 40},
            },
            "model_name": "test-model",
            "system_fingerprint": "fp_123",
        },
    ]

    result = llm._combine_llm_outputs(llm_outputs)

    assert result["token_usage"]["prompt_tokens"] == 300
    assert result["token_usage"]["completion_tokens"] == 150
    assert result["token_usage"]["total_tokens"] == 450
    assert result["token_usage"]["input_tokens_details"]["cached_tokens"] == 230
    assert result["token_usage"]["output_tokens_details"]["reasoning_tokens"] == 60
    assert result["model_name"] == "test-model"
    assert result["system_fingerprint"] == "fp_123"
Frequently Asked Questions
What does test_combine_llm_outputs_with_token_details() do?
test_combine_llm_outputs_with_token_details() is a unit test in the langchain codebase, defined in libs/partners/groq/tests/unit_tests/test_chat_models.py. It verifies that ChatGroq._combine_llm_outputs() sums token usage across multiple LLM outputs, including the nested input_tokens_details and output_tokens_details counters, while preserving the shared model_name and system_fingerprint.
Where is test_combine_llm_outputs_with_token_details() defined?
test_combine_llm_outputs_with_token_details() is defined in libs/partners/groq/tests/unit_tests/test_chat_models.py at line 873.
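In normal use this merging is not called directly: when a chat model runs several generations in one call, the per-call llm_output dictionaries are combined via _combine_llm_outputs() and surfaced on the returned LLMResult. A hedged usage sketch, assuming a valid GROQ_API_KEY and an available model (the model name below is illustrative, not prescribed by the test):

from langchain_core.messages import HumanMessage
from langchain_groq import ChatGroq

# Illustrative only: requires GROQ_API_KEY; the model name is an assumption.
llm = ChatGroq(model="llama-3.1-8b-instant")

# Batching two prompts produces two per-call llm_output dicts, which
# generate() merges into a single result.llm_output.
result = llm.generate(
    [[HumanMessage(content="Hello")], [HumanMessage(content="Hello again")]],
)
combined = result.llm_output or {}
print(combined.get("token_usage"))  # aggregated counts across both calls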