_get_encoding_model() — langchain Function Reference
Architecture documentation for the _get_encoding_model() function in base.py from the langchain codebase.
Dependency Diagram
graph TD
    927425a4_5f3b_5196_e538_3d7cee86141b["_get_encoding_model()"]
    2a683305_667b_3567_cab9_9f77e29d4afa["BaseChatOpenAI"]
    927425a4_5f3b_5196_e538_3d7cee86141b -->|defined in| 2a683305_667b_3567_cab9_9f77e29d4afa
    4844666f_f624_b0a3_473d_64deaf5a46b1["get_token_ids()"]
    4844666f_f624_b0a3_473d_64deaf5a46b1 -->|calls| 927425a4_5f3b_5196_e538_3d7cee86141b
    3d7b9c9e_fae0_940a_5e96_75a84e2ad11c["get_num_tokens_from_messages()"]
    3d7b9c9e_fae0_940a_5e96_75a84e2ad11c -->|calls| 927425a4_5f3b_5196_e538_3d7cee86141b
    style 927425a4_5f3b_5196_e538_3d7cee86141b fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/partners/openai/langchain_openai/chat_models/base.py lines 1741–1755
def _get_encoding_model(self) -> tuple[str, tiktoken.Encoding]:
    if self.tiktoken_model_name is not None:
        model = self.tiktoken_model_name
    else:
        model = self.model_name
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        model_lower = model.lower()
        encoder = "cl100k_base"
        if model_lower.startswith(("gpt-4o", "gpt-4.1", "gpt-5")):
            encoder = "o200k_base"
        encoding = tiktoken.get_encoding(encoder)
    return model, encoding
Frequently Asked Questions
What does _get_encoding_model() do?
_get_encoding_model() resolves which model name to use for token counting (preferring tiktoken_model_name over model_name) and returns that name together with the matching tiktoken.Encoding. If tiktoken does not recognize the model, it falls back to the o200k_base encoding for gpt-4o, gpt-4.1, and gpt-5 models, and to cl100k_base otherwise.
Where is _get_encoding_model() defined?
_get_encoding_model() is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 1741.
What calls _get_encoding_model()?
_get_encoding_model() is called by two functions: get_num_tokens_from_messages() and get_token_ids().