save_context() — langchain Function Reference
Architecture documentation for the save_context() function in token_buffer.py from the langchain codebase.
Entity Profile
Dependency Diagram
```mermaid
graph TD
    0691aa05_346f_c6dd_e63d_0940e98ba0c6["save_context()"]
    aad5faf6_b420_1801_aeb0_9324a6e4c593["ConversationTokenBufferMemory"]
    0691aa05_346f_c6dd_e63d_0940e98ba0c6 -->|defined in| aad5faf6_b420_1801_aeb0_9324a6e4c593
    style 0691aa05_346f_c6dd_e63d_0940e98ba0c6 fill:#6366f1,stroke:#818cf8,color:#fff
```
Source Code
libs/langchain/langchain_classic/memory/token_buffer.py lines 61–71
```python
def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
    """Save context from this conversation to buffer. Pruned."""
    super().save_context(inputs, outputs)
    # Prune buffer if it exceeds max token limit
    buffer = self.chat_memory.messages
    curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
    if curr_buffer_length > self.max_token_limit:
        pruned_memory = []
        while curr_buffer_length > self.max_token_limit:
            pruned_memory.append(buffer.pop(0))
            curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
```
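The pruning loop above can be sketched without any langchain dependency. In this sketch, `prune_buffer` and `count_tokens` are illustrative stand-ins: `count_tokens` (1 token per word) substitutes for `llm.get_num_tokens_from_messages()`, and plain strings stand in for message objects.

```python
def prune_buffer(buffer, max_token_limit, count_tokens):
    """Drop oldest messages until the buffer fits within max_token_limit.

    Mirrors the loop in save_context(): messages are popped from the
    front (oldest first) and collected in pruned_memory.
    """
    pruned_memory = []
    curr_buffer_length = count_tokens(buffer)
    while curr_buffer_length > max_token_limit:
        pruned_memory.append(buffer.pop(0))
        curr_buffer_length = count_tokens(buffer)
    return pruned_memory


def count_tokens(messages):
    # Stand-in token counter: 1 token per whitespace-separated word.
    return sum(len(m.split()) for m in messages)


buffer = ["hello there", "how are you", "fine thanks", "what is new"]
pruned = prune_buffer(buffer, max_token_limit=5, count_tokens=count_tokens)
# buffer now holds only the most recent messages that fit the limit
```

Note that pruning always removes the oldest messages first, so the buffer retains the most recent turns of the conversation.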
Frequently Asked Questions
What does save_context() do?
save_context() saves the latest conversation turn to the memory buffer, then prunes the oldest messages whenever the buffer's token count, as measured by the LLM's get_num_tokens_from_messages(), exceeds max_token_limit. It is a method of ConversationTokenBufferMemory, defined in libs/langchain/langchain_classic/memory/token_buffer.py.
Where is save_context() defined?
save_context() is defined in libs/langchain/langchain_classic/memory/token_buffer.py at line 61.
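To show how save_context() behaves end to end, here is a simplified, self-contained sketch. FakeLLM and the TokenBufferMemory class are illustrative stand-ins, not the real langchain API: the real class inherits its buffer handling from a chat-memory base class and stores message objects rather than strings.

```python
class FakeLLM:
    """Stand-in for an LLM client: counts 1 token per word."""

    def get_num_tokens_from_messages(self, messages):
        return sum(len(m.split()) for m in messages)


class TokenBufferMemory:
    """Simplified stand-in for ConversationTokenBufferMemory."""

    def __init__(self, llm, max_token_limit):
        self.llm = llm
        self.max_token_limit = max_token_limit
        self.messages = []

    def save_context(self, inputs, outputs):
        # Append the new human/AI turn, then prune oldest-first
        # until the buffer fits within the token limit.
        self.messages.append(inputs["input"])
        self.messages.append(outputs["output"])
        while self.llm.get_num_tokens_from_messages(self.messages) > self.max_token_limit:
            self.messages.pop(0)


memory = TokenBufferMemory(FakeLLM(), max_token_limit=6)
memory.save_context({"input": "hi there"}, {"output": "hello how can I help"})
memory.save_context({"input": "tell me a joke"}, {"output": "why did the chicken"})
# Only the most recent messages that fit within 6 tokens remain.
```

The design point is that pruning happens eagerly at write time, so every read of the buffer is already within the token budget.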