test_prompt_with_llm_and_async_lambda() — langchain Function Reference
Architecture documentation for the test_prompt_with_llm_and_async_lambda() function in test_runnable.py from the langchain codebase.
Dependency Diagram
graph TD
    96f9ef7f_0308_c4d3_fb2f_8e6c1db1519f["test_prompt_with_llm_and_async_lambda()"]
    26df6ad8_0189_51d0_c3c1_6c3248893ff5["test_runnable.py"]
    96f9ef7f_0308_c4d3_fb2f_8e6c1db1519f -->|defined in| 26df6ad8_0189_51d0_c3c1_6c3248893ff5
    style 96f9ef7f_0308_c4d3_fb2f_8e6c1db1519f fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/core/tests/unit_tests/runnables/test_runnable.py lines 2580–2617
async def test_prompt_with_llm_and_async_lambda(
    mocker: MockerFixture, snapshot: SnapshotAssertion
) -> None:
    prompt = (
        SystemMessagePromptTemplate.from_template("You are a nice assistant.")
        + "{question}"
    )
    llm = FakeListLLM(responses=["foo", "bar"])

    async def passthrough(value: Any) -> Any:
        return value

    chain = prompt | llm | passthrough

    assert isinstance(chain, RunnableSequence)
    assert chain.first == prompt
    assert chain.middle == [llm]
    assert chain.last == RunnableLambda(func=passthrough)
    assert dumps(chain, pretty=True) == snapshot

    # Test invoke
    prompt_spy = mocker.spy(prompt.__class__, "ainvoke")
    llm_spy = mocker.spy(llm.__class__, "ainvoke")
    tracer = FakeTracer()
    assert (
        await chain.ainvoke({"question": "What is your name?"}, {"callbacks": [tracer]})
        == "foo"
    )
    assert prompt_spy.call_args.args[1] == {"question": "What is your name?"}
    assert llm_spy.call_args.args[1] == ChatPromptValue(
        messages=[
            SystemMessage(content="You are a nice assistant."),
            HumanMessage(content="What is your name?"),
        ]
    )
    assert tracer.runs == snapshot
    mocker.stop(prompt_spy)
    mocker.stop(llm_spy)
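The behavior at the heart of this test is that piping a plain async function into a chain with `|` coerces it into a runnable step (`RunnableLambda`), so `ainvoke` can await it like any other component. The sketch below illustrates that coercion mechanism in standalone Python; it is not langchain's actual implementation, and the class names (`Runnable`, `Lambda`, `Sequence`, `coerce`) are illustrative only:

```python
import asyncio
import inspect
from typing import Any, Callable, List


class Runnable:
    """Minimal async-invocable unit composable with `|` (illustrative)."""

    async def ainvoke(self, value: Any) -> Any:
        raise NotImplementedError

    def __or__(self, other: Any) -> "Sequence":
        # Piping coerces the right-hand operand into a runnable step.
        return Sequence([self, coerce(other)])


class Lambda(Runnable):
    """Wraps a plain (sync or async) function as a runnable step."""

    def __init__(self, func: Callable[[Any], Any]) -> None:
        self.func = func

    async def ainvoke(self, value: Any) -> Any:
        result = self.func(value)
        # Await the result only if the wrapped function was async.
        return await result if inspect.isawaitable(result) else result


class Sequence(Runnable):
    """Runs steps left to right, feeding each output into the next step."""

    def __init__(self, steps: List[Runnable]) -> None:
        self.steps = steps

    def __or__(self, other: Any) -> "Sequence":
        return Sequence(self.steps + [coerce(other)])

    async def ainvoke(self, value: Any) -> Any:
        for step in self.steps:
            value = await step.ainvoke(value)
        return value


def coerce(thing: Any) -> Runnable:
    """Bare callables become Lambda steps, mirroring `llm | passthrough`."""
    return thing if isinstance(thing, Runnable) else Lambda(thing)


async def main() -> None:
    async def passthrough(value: Any) -> Any:
        return value

    upper = Lambda(lambda s: s.upper())
    chain = upper | passthrough  # async function coerced into a step
    print(await chain.ainvoke("foo"))  # FOO


asyncio.run(main())
```

This mirrors the assertions in the test: the composed object is a sequence, and its last step is the wrapped async function, which `ainvoke` awaits transparently.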
Frequently Asked Questions
What does test_prompt_with_llm_and_async_lambda() do?
test_prompt_with_llm_and_async_lambda() is an async unit test in the langchain codebase, defined in libs/core/tests/unit_tests/runnables/test_runnable.py. It verifies that composing a prompt template, a FakeListLLM, and a plain async function with the `|` operator produces a RunnableSequence whose last step is the async function coerced into a RunnableLambda, and that `ainvoke` threads the input through the whole chain, using mocker spies on `ainvoke` and a FakeTracer to check intermediate values against snapshots.
Where is test_prompt_with_llm_and_async_lambda() defined?
test_prompt_with_llm_and_async_lambda() is defined in libs/core/tests/unit_tests/runnables/test_runnable.py at line 2580.