_get_llm() — langchain Function Reference
Architecture documentation for the _get_llm() function in test_azure.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    ccf90d51_5ff2_7b50_5922_b28b80cca584["_get_llm()"]
    c413d48d_e43d_eae6_47cb_3eea9394c77c["test_azure.py"]
    ccf90d51_5ff2_7b50_5922_b28b80cca584 -->|defined in| c413d48d_e43d_eae6_47cb_3eea9394c77c
    aee16c32_cab1_8ceb_18c9_c82d7d6dc090["llm()"]
    aee16c32_cab1_8ceb_18c9_c82d7d6dc090 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    edae9682_b823_6c64_eb9b_a3fb61170219["test_chat_openai_generate()"]
    edae9682_b823_6c64_eb9b_a3fb61170219 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    31cbecb0_a9b3_5b97_22c4_886d734e52da["test_chat_openai_multiple_completions()"]
    31cbecb0_a9b3_5b97_22c4_886d734e52da -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    cfd81479_eabb_46d3_fba5_81dbf99e9bc9["test_chat_openai_streaming()"]
    cfd81479_eabb_46d3_fba5_81dbf99e9bc9 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    9d924a95_2652_a546_2242_dd200471a820["test_chat_openai_streaming_generation_info()"]
    9d924a95_2652_a546_2242_dd200471a820 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    5320ed4f_4f10_b82c_ca4f_a07204e7a212["test_async_chat_openai()"]
    5320ed4f_4f10_b82c_ca4f_a07204e7a212 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    190dd465_6696_f5b0_964a_48ec027dedc9["test_async_chat_openai_streaming()"]
    190dd465_6696_f5b0_964a_48ec027dedc9 -->|calls| ccf90d51_5ff2_7b50_5922_b28b80cca584
    style ccf90d51_5ff2_7b50_5922_b28b80cca584 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/tests/integration_tests/chat_models/test_azure.py lines 32–39
def _get_llm(**kwargs: Any) -> AzureChatOpenAI:
    return AzureChatOpenAI(  # type: ignore[call-arg, call-arg, call-arg]
        deployment_name=DEPLOYMENT_NAME,
        openai_api_version=OPENAI_API_VERSION,
        azure_endpoint=OPENAI_API_BASE,
        openai_api_key=OPENAI_API_KEY,
        **kwargs,
    )
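The module-level constants used here (DEPLOYMENT_NAME, OPENAI_API_VERSION, OPENAI_API_BASE, OPENAI_API_KEY) are not part of this excerpt. The sketch below shows one plausible way an integration-test module populates such constants from environment variables; the environment-variable names and defaults are assumptions for illustration, not a copy of test_azure.py.

import os

# Assumed environment-backed configuration; the env-var keys and defaults
# below are illustrative, not taken from the real test module.
OPENAI_API_VERSION = os.environ.get("AZURE_OPENAI_API_VERSION", "2024-02-01")
OPENAI_API_BASE = os.environ.get("AZURE_OPENAI_API_BASE", "")
OPENAI_API_KEY = os.environ.get("AZURE_OPENAI_API_KEY", "")
DEPLOYMENT_NAME = os.environ.get("AZURE_OPENAI_DEPLOYMENT_NAME", "")

With constants like these in place, individual tests can override model parameters per call, for example _get_llm(max_tokens=10) or _get_llm(n=5, temperature=0).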
Frequently Asked Questions
What does _get_llm() do?
_get_llm() is a test helper defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py. It constructs an AzureChatOpenAI instance preconfigured with the module's Azure deployment name, API version, endpoint, and API key constants, and forwards any keyword arguments so individual tests can override model parameters.
Where is _get_llm() defined?
_get_llm() is defined in libs/partners/openai/tests/integration_tests/chat_models/test_azure.py at line 32.
What calls _get_llm()?
_get_llm() is called by 7 functions: llm, test_async_chat_openai, test_async_chat_openai_streaming, test_chat_openai_generate, test_chat_openai_multiple_completions, test_chat_openai_streaming, and test_chat_openai_streaming_generation_info.
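For context, the sketch below shows how callers such as the llm() fixture and test_chat_openai_generate() typically use the helper. The fixture arguments, message content, and assertions are illustrative, not copied from the test file.

import pytest
from langchain_core.messages import BaseMessage, HumanMessage
from langchain_openai import AzureChatOpenAI


@pytest.fixture
def llm() -> AzureChatOpenAI:
    # Shared fixture: a small, inexpensive model instance (max_tokens value is illustrative).
    return _get_llm(max_tokens=10)


def test_chat_openai_generate() -> None:
    # Each test builds its own instance via _get_llm() and exercises one API surface.
    chat = _get_llm(max_tokens=10)
    message = HumanMessage(content="Hello")
    response = chat.generate([[message], [message]])
    assert len(response.generations) == 2
    for generations in response.generations:
        assert isinstance(generations[0].message, BaseMessage)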