save_context() — langchain Function Reference

Architecture documentation for the save_context() function in agent_token_buffer_memory.py from the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  449d8ca3_de3c_50e5_86cb_78b75536307e["save_context()"]
  45d63404_b604_4c09_dcac_77e8840caf9b["AgentTokenBufferMemory"]
  449d8ca3_de3c_50e5_86cb_78b75536307e -->|defined in| 45d63404_b604_4c09_dcac_77e8840caf9b
  style 449d8ca3_de3c_50e5_86cb_78b75536307e fill:#6366f1,stroke:#818cf8,color:#fff

Source Code

libs/langchain/langchain_classic/agents/openai_functions_agent/agent_token_buffer_memory.py lines 75–99

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, Any]) -> None:
        """Save context from this conversation to buffer. Pruned.

        Args:
            inputs: Inputs to the agent.
            outputs: Outputs from the agent.
        """
        input_str, output_str = self._get_input_output(inputs, outputs)
        self.chat_memory.add_messages(input_str)  # type: ignore[arg-type]
        format_to_messages = (
            format_to_tool_messages
            if self.format_as_tools
            else format_to_openai_function_messages
        )
        steps = format_to_messages(outputs[self.intermediate_steps_key])
        for msg in steps:
            self.chat_memory.add_message(msg)
        self.chat_memory.add_messages(output_str)  # type: ignore[arg-type]
        # Prune buffer if it exceeds max token limit
        buffer = self.chat_memory.messages
        curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
        if curr_buffer_length > self.max_token_limit:
            while curr_buffer_length > self.max_token_limit:
                buffer.pop(0)
                curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
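The pruning step at the end of the method can be sketched in isolation: messages are dropped from the front of the buffer, oldest first, until the total token count fits the limit. A minimal standalone sketch, where the word-count heuristic `count_tokens` is a stand-in for `llm.get_num_tokens_from_messages()` and plain strings stand in for message objects:

```python
def count_tokens(buffer: list[str]) -> int:
    # Crude stand-in for llm.get_num_tokens_from_messages():
    # count whitespace-separated words across all messages.
    return sum(len(msg.split()) for msg in buffer)

def prune(buffer: list[str], max_token_limit: int) -> None:
    # Mirror of the loop in save_context(): drop the oldest
    # message until the buffer fits within the token budget.
    curr_buffer_length = count_tokens(buffer)
    while curr_buffer_length > max_token_limit:
        buffer.pop(0)
        curr_buffer_length = count_tokens(buffer)

buffer = ["hello there friend", "how are you", "fine thanks"]
prune(buffer, max_token_limit=6)
# buffer is now ["how are you", "fine thanks"] (5 tokens <= 6)
```

Note that pruning mutates `self.chat_memory.messages` in place via `buffer.pop(0)`, and that a single oversized message at the front is dropped whole rather than truncated.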

Frequently Asked Questions

What does save_context() do?
save_context() persists one conversation turn to the agent's chat memory: it records the user input, the agent's intermediate steps (formatted as tool messages or OpenAI function messages, depending on format_as_tools), and the final output. It then prunes the oldest messages from the buffer until the total token count, as measured by llm.get_num_tokens_from_messages(), falls within max_token_limit.
Where is save_context() defined?
save_context() is defined in libs/langchain/langchain_classic/agents/openai_functions_agent/agent_token_buffer_memory.py at line 75.
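For orientation, the write order the method produces can be mimicked without LangChain at all. In this sketch, (role, content) tuples are illustrative stand-ins for LangChain message objects, the step messages stand in for the output of format_to_tool_messages / format_to_openai_function_messages, and record_turn is a hypothetical helper, not part of the library:

```python
def record_turn(input_str, step_msgs, output_str):
    # Illustrative mimic of the order save_context() writes to
    # chat memory: human input first, then the intermediate
    # agent/tool messages, then the final AI output.
    memory = []
    memory.append(("human", input_str))  # add_messages(input_str)
    memory.extend(step_msgs)             # one add_message per step
    memory.append(("ai", output_str))    # add_messages(output_str)
    return memory

history = record_turn(
    "What is 2 + 2?",
    [("ai", "calling calculator(2 + 2)"), ("tool", "4")],
    "2 + 2 is 4.",
)
# history preserves the turn order: input, intermediate steps, answer
```

Because pruning runs only after all of these writes, a turn whose intermediate steps are large can push older turns out of the buffer in a single call.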
