test_streaming_generation_info() — langchain Function Reference
Architecture documentation for the test_streaming_generation_info() function in test_chat_models.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    0668c062_4348_3c66_cb15_7872fc746a68["test_streaming_generation_info()"]
    af57ae60_607e_c138_9ab0_fb8bb1c5916a["test_chat_models.py"]
    0668c062_4348_3c66_cb15_7872fc746a68 -->|defined in| af57ae60_607e_c138_9ab0_fb8bb1c5916a
    57dfdc76_4351_a024_13da_6929ba713661["on_llm_end()"]
    0668c062_4348_3c66_cb15_7872fc746a68 -->|calls| 57dfdc76_4351_a024_13da_6929ba713661
    style 0668c062_4348_3c66_cb15_7872fc746a68 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/groq/tests/integration_tests/test_chat_models.py lines 366–391
def test_streaming_generation_info() -> None:
    """Test that generation info is preserved when streaming."""

    class _FakeCallback(FakeCallbackHandler):
        saved_things: dict = {}

        def on_llm_end(
            self,
            *args: Any,
            **kwargs: Any,
        ) -> Any:
            # Save the generation
            self.saved_things["generation"] = args[0]

    callback = _FakeCallback()
    chat = ChatGroq(
        model="llama-3.1-8b-instant",  # Use a model that properly streams content
        max_tokens=2,
        temperature=0,
        callbacks=[callback],
    )
    list(chat.stream("Respond with the single word Hello", stop=["o"]))
    generation = callback.saved_things["generation"]
    # `Hello!` is two tokens; with stop=["o"], only "Hell" should come back
    assert isinstance(generation, LLMResult)
    assert generation.generations[0][0].text == "Hell"
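The mechanics this test relies on can be sketched without a Groq API key. In the dependency-free sketch below, LLMResultStub, SavingCallback, and FakeStreamingModel are hypothetical stand-ins (not langchain code) that mimic the relevant shape of the interface: a model streams tokens, truncates at a stop sequence, and hands the aggregated final result to each registered callback's on_llm_end().

```python
class LLMResultStub:
    """Stands in for langchain's LLMResult: exposes generations[0][0].text."""

    def __init__(self, text: str) -> None:
        gen = type("_Gen", (), {})()
        gen.text = text
        self.generations = [[gen]]


class SavingCallback:
    """Mirrors the test's _FakeCallback: stores whatever on_llm_end receives."""

    def __init__(self) -> None:
        self.saved_things: dict = {}

    def on_llm_end(self, *args, **kwargs):
        self.saved_things["generation"] = args[0]


class FakeStreamingModel:
    """Hypothetical model: streams canned tokens, honors a stop sequence."""

    def __init__(self, callbacks) -> None:
        self.callbacks = callbacks

    def stream(self, prompt: str, stop=()):
        pieces = []
        for token in ["Hell", "o!"]:  # pretend token stream for "Hello!"
            for s in stop:
                if s in token:
                    # Truncate at the stop sequence and finish early.
                    token = token.split(s, 1)[0]
                    pieces.append(token)
                    if token:
                        yield token
                    self._finish("".join(pieces))
                    return
            pieces.append(token)
            yield token
        self._finish("".join(pieces))

    def _finish(self, text: str) -> None:
        # Deliver the aggregated result to every registered callback,
        # the hook the integration test intercepts.
        result = LLMResultStub(text)
        for cb in self.callbacks:
            cb.on_llm_end(result)


callback = SavingCallback()
chat = FakeStreamingModel(callbacks=[callback])
list(chat.stream("Respond with the single word Hello", stop=["o"]))
generation = callback.saved_things["generation"]
assert generation.generations[0][0].text == "Hell"
```

The sketch reproduces the test's core assertion: the stop token "o" cuts the stream after "Hell", and that truncated text is what on_llm_end() observes in the final result.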
Frequently Asked Questions
What does test_streaming_generation_info() do?
test_streaming_generation_info() is an integration test for ChatGroq, defined in libs/partners/groq/tests/integration_tests/test_chat_models.py. It streams a short completion with a stop sequence and uses a custom callback handler to verify that the final LLMResult delivered to on_llm_end preserves the generated text.
Where is test_streaming_generation_info() defined?
test_streaming_generation_info() is defined in libs/partners/groq/tests/integration_tests/test_chat_models.py at line 366.
What does test_streaming_generation_info() call?
test_streaming_generation_info() calls one function: on_llm_end(), which it defines on its _FakeCallback handler and which the callback machinery invokes when streaming completes.