test_openai_agent_with_streaming() — langchain Function Reference
Architecture documentation for the test_openai_agent_with_streaming() function in test_agent.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    3fe24fd2_57d5_7278_725d_f532b8fd70bc["test_openai_agent_with_streaming()"]
    47a7b285_8e60_f78f_282d_429958c446fa["test_agent.py"]
    3fe24fd2_57d5_7278_725d_f532b8fd70bc -->|defined in| 47a7b285_8e60_f78f_282d_429958c446fa
    29ab623b_0297_07ba_a564_05f00fcf9b12["_make_func_invocation()"]
    3fe24fd2_57d5_7278_725d_f532b8fd70bc -->|calls| 29ab623b_0297_07ba_a564_05f00fcf9b12
    d057073c_d436_881b_39c0_9a71bc129dcd["_recursive_dump()"]
    3fe24fd2_57d5_7278_725d_f532b8fd70bc -->|calls| d057073c_d436_881b_39c0_9a71bc129dcd
    style 3fe24fd2_57d5_7278_725d_f532b8fd70bc fill:#6366f1,stroke:#818cf8,color:#fff
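Both helpers shown in the diagram are defined elsewhere in test_agent.py and their bodies are not reproduced on this page. As an illustration only, here is a minimal sketch of what _make_func_invocation plausibly returns, inferred from the function_call payload asserted later in the test; the actual helper may differ in detail.

import json
from typing import Any

from langchain_core.messages import AIMessage


def _make_func_invocation(name: str, **kwargs: Any) -> AIMessage:
    """Build an AIMessage that mimics an OpenAI function-call response."""
    return AIMessage(
        content="",
        additional_kwargs={
            "function_call": {
                "name": name,
                # json.dumps(kwargs) yields '{"pet": "cat"}' for pet="cat",
                # matching the arguments string asserted in the test.
                "arguments": json.dumps(kwargs),
            },
        },
    )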
Source Code
libs/langchain/tests/unit_tests/agents/test_agent.py lines 831–992
async def test_openai_agent_with_streaming() -> None:
    """Test openai agent with streaming."""
    infinite_cycle = cycle(
        [
            _make_func_invocation("find_pet", pet="cat"),
            AIMessage(content="The cat is spying from under the bed."),
        ],
    )
    model = GenericFakeChatModel(messages=infinite_cycle)

    @tool
    def find_pet(pet: str) -> str:
        """Find the given pet."""
        if pet != "cat":
            msg = "Only cats allowed"
            raise ValueError(msg)
        return "Spying from under the bed."

    template = ChatPromptTemplate.from_messages(
        [
            ("system", "You are a helpful AI bot. Your name is kitty power meow."),
            ("human", "{question}"),
            MessagesPlaceholder(
                variable_name="agent_scratchpad",
            ),
        ],
    )

    # type error due to base tool type below -- would need to be adjusted on tool
    # decorator.
    agent = create_openai_functions_agent(
        model,
        [find_pet],
        template,
    )
    executor = AgentExecutor(agent=agent, tools=[find_pet])

    # Invoke
    result = await asyncio.to_thread(executor.invoke, {"question": "hello"})
    assert result == {
        "output": "The cat is spying from under the bed.",
        "question": "hello",
    }

    # astream
    chunks = [chunk async for chunk in executor.astream({"question": "hello"})]
    assert _recursive_dump(chunks) == [
        {
            "actions": [
                {
                    "log": "\nInvoking: `find_pet` with `{'pet': 'cat'}`\n\n\n",
                    "message_log": [
                        {
                            "additional_kwargs": {
                                "function_call": {
                                    "arguments": '{"pet": "cat"}',
                                    "name": "find_pet",
                                },
                            },
                            "content": "",
                            "name": None,
                            "response_metadata": {},
                            "type": "AIMessageChunk",
                        },
                    ],
                    "tool": "find_pet",
                    "tool_input": {"pet": "cat"},
                    "type": "AgentActionMessageLog",
                },
            ],
            "messages": [
                {
                    "additional_kwargs": {
                        "function_call": {
                            "arguments": '{"pet": "cat"}',
                            "name": "find_pet",
                        },
                    },
                    "chunk_position": "last",
                    "content": "",
    # … (listing truncated here; the full test continues through line 992)
Frequently Asked Questions
What does test_openai_agent_with_streaming() do?
test_openai_agent_with_streaming() is an async unit test in the langchain codebase, defined in libs/langchain/tests/unit_tests/agents/test_agent.py. It builds a GenericFakeChatModel that cycles between a fake find_pet function call and a final AIMessage, wires it into create_openai_functions_agent and an AgentExecutor, and then asserts on both the result of invoke and the chunks yielded by astream.
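To run just this test locally, one approach (assuming pytest plus an asyncio test plugin such as pytest-asyncio is available in the environment) is:

import pytest

# Run only this test; the node ID combines the file path with the test name.
exit_code = pytest.main(
    [
        "libs/langchain/tests/unit_tests/agents/test_agent.py::test_openai_agent_with_streaming",
        "-q",
    ],
)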
Where is test_openai_agent_with_streaming() defined?
test_openai_agent_with_streaming() is defined in libs/langchain/tests/unit_tests/agents/test_agent.py at line 831.
What does test_openai_agent_with_streaming() call?
test_openai_agent_with_streaming() calls two helper functions: _make_func_invocation, which fabricates the fake OpenAI function-call messages fed to the model, and _recursive_dump, which converts the streamed chunks into plain dictionaries for the final assertion.