test_openai_async_streaming_callback() — langchain Function Reference
Architecture documentation for the test_openai_async_streaming_callback() function in test_base.py from the langchain codebase.
Dependency Diagram
graph TD
  6148354f_cc8b_b8ab_fd31_32696a826755["test_openai_async_streaming_callback()"]
  2edc9f83_2189_a9c8_70f5_10e2e98e8272["test_base.py"]
  6148354f_cc8b_b8ab_fd31_32696a826755 -->|defined in| 2edc9f83_2189_a9c8_70f5_10e2e98e8272
  style 6148354f_cc8b_b8ab_fd31_32696a826755 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/llms/test_base.py lines 232–247
async def test_openai_async_streaming_callback() -> None:
    """Test that streaming correctly invokes on_llm_new_token callback."""
    callback_handler = FakeCallbackHandler()
    callback_manager = CallbackManager([callback_handler])
    llm = OpenAI(
        max_tokens=10,
        streaming=True,
        temperature=0,
        callbacks=callback_manager,
        verbose=True,
    )
    result = await llm.agenerate(["Write me a sentence with 100 words."])
    # new client sometimes passes 2 tokens at once
    assert callback_handler.llm_streams >= 5
    assert isinstance(result, LLMResult)
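The test's core idea is that a callback handler's on_llm_new_token hook fires once per streamed token (or occasionally per pair of tokens), so counting invocations verifies streaming happened. A minimal sketch of that pattern, assuming FakeCallbackHandler simply increments a counter on each token (the class name CountingCallbackHandler and the fake_stream helper below are illustrative, not part of langchain):

```python
import asyncio


class CountingCallbackHandler:
    """Counts how many times the streaming callback is invoked."""

    def __init__(self) -> None:
        self.llm_streams = 0

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once for each token (or chunk) the model streams back.
        self.llm_streams += 1


async def fake_stream(handler: CountingCallbackHandler) -> None:
    # Simulate a model streaming ten tokens, one at a time.
    for token in "one two three four five six seven eight nine ten".split():
        handler.on_llm_new_token(token)
        await asyncio.sleep(0)  # yield control, as a real async stream would


handler = CountingCallbackHandler()
asyncio.run(fake_stream(handler))
# The test uses >= 5 rather than an exact count because the OpenAI client
# sometimes delivers two tokens in a single chunk.
assert handler.llm_streams >= 5
```

This is why the assertion in the real test is a lower bound: the exact number of callback invocations depends on how the client batches streamed chunks, so only a minimum is guaranteed.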
Frequently Asked Questions
What does test_openai_async_streaming_callback() do?
test_openai_async_streaming_callback() is an async integration test in the langchain codebase, defined in libs/partners/openai/tests/integration_tests/llms/test_base.py. It verifies that generating a completion with a streaming-enabled OpenAI LLM via agenerate() invokes the on_llm_new_token callback for each streamed token, and that the call returns an LLMResult.
Where is test_openai_async_streaming_callback() defined?
test_openai_async_streaming_callback() is defined in libs/partners/openai/tests/integration_tests/llms/test_base.py at line 232.