_get_ls_params() — langchain Function Reference
Architecture documentation for the _get_ls_params() function in base.py from the langchain codebase.
Dependency Diagram
graph TD
    get_ls_params["_get_ls_params()"]
    BaseChatOpenAI["BaseChatOpenAI"]
    get_ls_params -->|defined in| BaseChatOpenAI
    get_invocation_params["_get_invocation_params()"]
    get_ls_params -->|calls| get_invocation_params
    style get_ls_params fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/langchain_openai/chat_models/base.py lines 1714–1731
def _get_ls_params(
    self, stop: list[str] | None = None, **kwargs: Any
) -> LangSmithParams:
    """Get standard params for tracing."""
    params = self._get_invocation_params(stop=stop, **kwargs)
    ls_params = LangSmithParams(
        ls_provider="openai",
        ls_model_name=params.get("model", self.model_name),
        ls_model_type="chat",
        ls_temperature=params.get("temperature", self.temperature),
    )
    if ls_max_tokens := params.get("max_tokens", self.max_tokens) or params.get(
        "max_completion_tokens", self.max_tokens
    ):
        ls_params["ls_max_tokens"] = ls_max_tokens
    if ls_stop := stop or params.get("stop", None):
        ls_params["ls_stop"] = ls_stop
    return ls_params
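The helper is normally called on a configured chat model instance during tracing. Below is a minimal sketch of the resulting parameters, assuming a ChatOpenAI instance with explicit model, temperature, and max_tokens settings; the model name, stop value, and direct call to the private method are purely illustrative.

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.2, max_tokens=256)

# _get_ls_params() is a private helper used by the tracing machinery;
# calling it directly here is only for illustration.
ls_params = llm._get_ls_params(stop=["Observation:"])

# Illustrative output shape (values depend on the instance configuration):
# {"ls_provider": "openai", "ls_model_name": "gpt-4o-mini", "ls_model_type": "chat",
#  "ls_temperature": 0.2, "ls_max_tokens": 256, "ls_stop": ["Observation:"]}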
Frequently Asked Questions
What does _get_ls_params() do?
_get_ls_params() builds the LangSmithParams used for tracing a chat model call: it records the provider ("openai"), the model name, the model type ("chat"), and the temperature, and adds max tokens and stop sequences when they are available. It is defined on BaseChatOpenAI in libs/partners/openai/langchain_openai/chat_models/base.py.
Where is _get_ls_params() defined?
_get_ls_params() is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 1714.
What does _get_ls_params() call?
_get_ls_params() calls one function, _get_invocation_params(), which supplies the invocation parameters that the LangSmith fields are derived from.
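Because _get_ls_params() reads from the dictionary returned by _get_invocation_params(), per-call keyword overrides can show up in the traced parameters. A hedged sketch, assuming kwargs passed at call time are merged into the invocation params (as the params.get(...) fallbacks in the source suggest):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)

# With no overrides, the instance defaults are reported.
default_params = llm._get_ls_params()
# default_params.get("ls_temperature") -> 0.7  (from self.temperature)

# A keyword override flows through _get_invocation_params() and takes precedence.
overridden = llm._get_ls_params(temperature=0.0)
# overridden.get("ls_temperature") -> 0.0  (from params.get("temperature", ...))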