test_load.py — langchain Source File
Architecture documentation for test_load.py, a Python file in the langchain codebase. It has 5 imports and 0 dependents.
Entity Profile
Dependency Diagram
graph LR
  f594231b_8e8d_61b8_e688_99b6f31430d1["test_load.py"]
  86d015d7_8a78_acc2_abd6_9cb22b0ae1aa["langchain_core.load"]
  f594231b_8e8d_61b8_e688_99b6f31430d1 --> 86d015d7_8a78_acc2_abd6_9cb22b0ae1aa
  16c7d167_e2e4_cd42_2bc2_d182459cd93c["langchain_core.prompts.chat"]
  f594231b_8e8d_61b8_e688_99b6f31430d1 --> 16c7d167_e2e4_cd42_2bc2_d182459cd93c
  4b3dcc0f_d872_0044_39ec_2d289f87f9e6["langchain_core.prompts.prompt"]
  f594231b_8e8d_61b8_e688_99b6f31430d1 --> 4b3dcc0f_d872_0044_39ec_2d289f87f9e6
  31eab4ab_7281_1e6c_b17d_12e6ad9de07a["langchain_core.runnables"]
  f594231b_8e8d_61b8_e688_99b6f31430d1 --> 31eab4ab_7281_1e6c_b17d_12e6ad9de07a
  2cad93e6_586a_5d28_a74d_4ec6fd4d2227["langchain_openai"]
  f594231b_8e8d_61b8_e688_99b6f31430d1 --> 2cad93e6_586a_5d28_a74d_4ec6fd4d2227
  style f594231b_8e8d_61b8_e688_99b6f31430d1 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
from langchain_core.load import dumpd, dumps, load, loads
from langchain_core.prompts.chat import ChatPromptTemplate, HumanMessagePromptTemplate
from langchain_core.prompts.prompt import PromptTemplate
from langchain_core.runnables import RunnableSequence
from langchain_openai import ChatOpenAI, OpenAI
def test_loads_openai_llm() -> None:
    llm = OpenAI(model="davinci", temperature=0.5, openai_api_key="hello", top_p=0.8)  # type: ignore[call-arg]
    llm_string = dumps(llm)
    llm2 = loads(
        llm_string,
        secrets_map={"OPENAI_API_KEY": "hello"},
        allowed_objects=[OpenAI],
    )
    assert llm2.dict() == llm.dict()
    llm_string_2 = dumps(llm2)
    assert llm_string_2 == llm_string
    assert isinstance(llm2, OpenAI)


def test_load_openai_llm() -> None:
    llm = OpenAI(model="davinci", temperature=0.5, openai_api_key="hello")  # type: ignore[call-arg]
    llm_obj = dumpd(llm)
    llm2 = load(
        llm_obj,
        secrets_map={"OPENAI_API_KEY": "hello"},
        allowed_objects=[OpenAI],
    )
    assert llm2.dict() == llm.dict()
    assert dumpd(llm2) == llm_obj
    assert isinstance(llm2, OpenAI)


def test_loads_openai_chat() -> None:
    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0.5, openai_api_key="hello")  # type: ignore[call-arg]
    llm_string = dumps(llm)
    llm2 = loads(
        llm_string,
        secrets_map={"OPENAI_API_KEY": "hello"},
        allowed_objects=[ChatOpenAI],
    )
    assert llm2.dict() == llm.dict()
    llm_string_2 = dumps(llm2)
    assert llm_string_2 == llm_string
    assert isinstance(llm2, ChatOpenAI)


def test_load_openai_chat() -> None:
    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0.5, openai_api_key="hello")  # type: ignore[call-arg]
    llm_obj = dumpd(llm)
    llm2 = load(
        llm_obj,
        secrets_map={"OPENAI_API_KEY": "hello"},
        allowed_objects=[ChatOpenAI],
    )
# ... (78 more lines)
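The truncated tests cover runnable sequences built from a prompt and a model, per the function names listed below. The following is a hedged sketch of that round-trip pattern in the same style as the tests above; the test name, prompt text, and exact assertions are illustrative assumptions rather than the file's actual elided lines, and it passes only secrets_map to loads, whereas the visible tests above additionally restrict allowed_objects.

from langchain_core.load import dumps, loads
from langchain_core.prompts.prompt import PromptTemplate
from langchain_core.runnables import RunnableSequence
from langchain_openai import OpenAI


def test_round_trip_prompt_model_sketch() -> None:
    # Build a small chain: prompt | model yields a RunnableSequence.
    prompt = PromptTemplate.from_template("Say hello to {name}.")
    llm = OpenAI(model="davinci", temperature=0.5, openai_api_key="hello")  # type: ignore[call-arg]
    chain = prompt | llm

    # Serialize to a JSON string and revive it, re-injecting the secret.
    chain_string = dumps(chain)
    chain2 = loads(chain_string, secrets_map={"OPENAI_API_KEY": "hello"})

    # The revived chain keeps its type and serializes to the same string.
    assert isinstance(chain2, RunnableSequence)
    assert dumps(chain2) == chain_string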
Domain
- LangChainCore
Subdomains
- Runnables
Functions
- test_load_openai_chat
- test_load_openai_llm
- test_load_runnable_sequence_prompt_model
- test_loads_openai_chat
- test_loads_openai_llm
- test_loads_runnable_sequence_prompt_model
Dependencies
- langchain_core.load
- langchain_core.prompts.chat
- langchain_core.prompts.prompt
- langchain_core.runnables
- langchain_openai
Source
- libs/partners/openai/tests/unit_tests/test_load.py
Frequently Asked Questions
What does test_load.py do?
test_load.py is a Python source file in the langchain codebase, belonging to the LangChainCore domain, Runnables subdomain. Its unit tests verify that langchain_openai models (OpenAI, ChatOpenAI) and runnable sequences built from a prompt and a model survive serialization round-trips through langchain_core.load's dumps/loads and dumpd/load helpers.
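For orientation, here is a minimal sketch of the round-trip the file exercises, assuming only the imports shown in the source above: dumps/loads go through a JSON string, while dumpd/load go through a plain dict representation.

from langchain_core.load import dumpd, dumps, load, loads
from langchain_openai import OpenAI

llm = OpenAI(model="davinci", temperature=0.5, openai_api_key="hello")  # type: ignore[call-arg]

# String round-trip: dumps -> JSON string -> loads (secret re-injected from the map).
llm_from_str = loads(dumps(llm), secrets_map={"OPENAI_API_KEY": "hello"})

# Dict round-trip: dumpd -> serializable dict -> load.
llm_from_dict = load(dumpd(llm), secrets_map={"OPENAI_API_KEY": "hello"})

assert llm.dict() == llm_from_str.dict() == llm_from_dict.dict()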
What functions are defined in test_load.py?
test_load.py defines 6 functions: test_load_openai_chat, test_load_openai_llm, test_load_runnable_sequence_prompt_model, test_loads_openai_chat, test_loads_openai_llm, test_loads_runnable_sequence_prompt_model.
What does test_load.py depend on?
test_load.py imports 5 modules: langchain_core.load, langchain_core.prompts.chat, langchain_core.prompts.prompt, langchain_core.runnables, langchain_openai.
Where is test_load.py in the architecture?
test_load.py is located at libs/partners/openai/tests/unit_tests/test_load.py (domain: LangChainCore, subdomain: Runnables, directory: libs/partners/openai/tests/unit_tests).