save() — langchain Function Reference
Architecture documentation for the save() function in llms.py from the langchain codebase.
Entity Profile
Dependency Diagram
```mermaid
graph TD
    e8c6aa5d_e4e1_8d81_626b_1edda0d13ebc["save()"]
    ce4aa464_3868_179e_5d99_df48bc307c5f["BaseLLM"]
    e8c6aa5d_e4e1_8d81_626b_1edda0d13ebc -->|defined in| ce4aa464_3868_179e_5d99_df48bc307c5f
    b4a028e5_e42e_3478_739f_03ee8ab9100d["dict()"]
    e8c6aa5d_e4e1_8d81_626b_1edda0d13ebc -->|calls| b4a028e5_e42e_3478_739f_03ee8ab9100d
    style e8c6aa5d_e4e1_8d81_626b_1edda0d13ebc fill:#6366f1,stroke:#818cf8,color:#fff
```
Source Code
libs/core/langchain_core/language_models/llms.py lines 1367–1398
def save(self, file_path: Path | str) -> None:
    """Save the LLM.

    Args:
        file_path: Path to file to save the LLM to.

    Raises:
        ValueError: If the file path is not a string or Path object.

    Example:

        ```python
        llm.save(file_path="path/llm.yaml")
        ```
    """
    # Convert file to Path object.
    save_path = Path(file_path)

    directory_path = save_path.parent
    directory_path.mkdir(parents=True, exist_ok=True)

    # Fetch dictionary to save
    prompt_dict = self.dict()

    if save_path.suffix == ".json":
        with save_path.open("w", encoding="utf-8") as f:
            json.dump(prompt_dict, f, indent=4)
    elif save_path.suffix.endswith((".yaml", ".yml")):
        with save_path.open("w", encoding="utf-8") as f:
            yaml.dump(prompt_dict, f, default_flow_style=False)
    else:
        msg = f"{save_path} must be json or yaml"
        raise ValueError(msg)
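For context, a minimal usage sketch based on the listing above. Here `llm` is assumed to be an already-configured instance of a BaseLLM subclass (not shown in this excerpt); the file paths are illustrative.

```python
from pathlib import Path

# Assumption: `llm` is an instance of a BaseLLM subclass, configured elsewhere.
# save() creates any missing parent directories before writing.
llm.save(file_path="configs/llm.yaml")        # written via yaml.dump
llm.save(file_path=Path("configs/llm.json"))  # written via json.dump, indent=4
```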
Frequently Asked Questions
What does save() do?
save() is an instance method defined on the BaseLLM class in libs/core/langchain_core/language_models/llms.py. It serializes the LLM's configuration via self.dict() and writes it to the given file path as JSON (.json) or YAML (.yaml/.yml), creating parent directories as needed; any other file extension raises a ValueError.
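A minimal sketch of the unsupported-extension case described above, assuming `llm` is an already-configured BaseLLM subclass instance:

```python
# Assumption: `llm` is a configured BaseLLM subclass instance.
# Extensions other than .json/.yaml/.yml are rejected by save().
try:
    llm.save(file_path="model.txt")
except ValueError as err:
    print(err)  # "... must be json or yaml"
```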
Where is save() defined?
save() is defined on the BaseLLM class in libs/core/langchain_core/language_models/llms.py, starting at line 1367.
What does save() call?
save() calls one method: dict(), whose return value is the payload written to disk.
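As a sketch of that relationship: the mapping returned by dict() is exactly what save() serializes, so it can be inspected beforehand. The concrete keys depend on the LLM subclass; `llm` is again assumed to be a configured instance.

```python
# Assumption: `llm` is a configured BaseLLM subclass instance.
payload = llm.dict()   # the mapping that save() will serialize
print(payload)
llm.save("llm.json")   # same data, written as indented JSON
```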