test_from_llm_uses_supplied_chatopenai() — langchain Function Reference
Architecture documentation for the test_from_llm_uses_supplied_chatopenai() function in test_flare.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    a881cff1_6b47_7a13_0dfb_4eb86ac909eb["test_from_llm_uses_supplied_chatopenai()"]
    61b72c8b_1d90_92a9_95b4_7adf79f2a9cf["test_flare.py"]
    a881cff1_6b47_7a13_0dfb_4eb86ac909eb -->|defined in| 61b72c8b_1d90_92a9_95b4_7adf79f2a9cf
    style a881cff1_6b47_7a13_0dfb_4eb86ac909eb fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/tests/unit_tests/chains/test_flare.py lines 34–51
def test_from_llm_uses_supplied_chatopenai(monkeypatch: pytest.MonkeyPatch) -> None:
    try:
        from langchain_openai import ChatOpenAI
    except ImportError:  # pragma: no cover
        pytest.skip("langchain-openai not installed")
    # Provide dummy API key to satisfy constructor env validation.
    monkeypatch.setenv("OPENAI_API_KEY", "TEST")
    supplied = ChatOpenAI(temperature=0.51, logprobs=True, max_completion_tokens=21)
    chain = FlareChain.from_llm(
        supplied,
        max_generation_len=32,
        retriever=_EmptyRetriever(),
    )
    llm_in_chain = cast("RunnableSequence", chain.question_generator_chain).steps[1]
    assert llm_in_chain is supplied
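The core of the test is the final `is` assertion, which checks object identity rather than equality: `FlareChain.from_llm()` must wire in the exact `ChatOpenAI` instance it was given, not a reconstructed copy. A minimal, self-contained sketch of that pattern, using hypothetical stand-ins (`FakeLLM` and `FakePipeline` are illustrations, not langchain APIs):

```python
class FakeLLM:
    """Stand-in for a supplied ChatOpenAI instance."""


class FakePipeline:
    """Stand-in for a RunnableSequence: keeps its steps in order."""

    def __init__(self, *steps):
        self.steps = list(steps)


def from_llm(llm: FakeLLM) -> FakePipeline:
    # Mirrors what the test expects of FlareChain.from_llm: the supplied
    # LLM object is placed into the pipeline unchanged.
    return FakePipeline("prompt", llm)


supplied = FakeLLM()
chain = from_llm(supplied)
# `is` asserts identity: the very same object, not an equal one.
assert chain.steps[1] is supplied
```

If `from_llm` instead built a fresh `FakeLLM()` internally, the identity assertion would fail even though the two objects might compare equal, which is exactly the regression the real test guards against.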
Frequently Asked Questions
What does test_from_llm_uses_supplied_chatopenai() do?
test_from_llm_uses_supplied_chatopenai() is a unit test in the langchain codebase, defined in libs/langchain/tests/unit_tests/chains/test_flare.py. It skips when langchain-openai is not installed, sets a dummy OPENAI_API_KEY so the ChatOpenAI constructor's environment validation passes, builds a FlareChain via FlareChain.from_llm(), and asserts that the resulting question generator chain holds the exact ChatOpenAI instance that was supplied (object identity, not a reconstructed copy).
Where is test_from_llm_uses_supplied_chatopenai() defined?
test_from_llm_uses_supplied_chatopenai() is defined in libs/langchain/tests/unit_tests/chains/test_flare.py at line 34.
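The dummy-credential step in the test, monkeypatch.setenv("OPENAI_API_KEY", "TEST"), can be mimicked with plain os.environ. A minimal sketch with manual cleanup, which pytest's monkeypatch fixture would otherwise handle automatically when the test ends:

```python
import os

# Set a dummy credential so env-var validation in a constructor passes,
# then restore the previous state in a finally block. pytest's
# monkeypatch.setenv performs the same set-and-restore automatically.
prior = os.environ.get("OPENAI_API_KEY")
os.environ["OPENAI_API_KEY"] = "TEST"
try:
    assert os.environ["OPENAI_API_KEY"] == "TEST"
finally:
    if prior is None:
        os.environ.pop("OPENAI_API_KEY", None)
    else:
        os.environ["OPENAI_API_KEY"] = prior
```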