AgentTokenBufferMemory Class — langchain Architecture
Architecture documentation for the AgentTokenBufferMemory class in agent_token_buffer_memory.py from the langchain codebase.
Entity Profile
Dependency Diagram
```mermaid
graph TD
    45d63404_b604_4c09_dcac_77e8840caf9b["AgentTokenBufferMemory"]
    b010d3e5_8f8c_da0a_9a18_268dce3d2b0b["BaseChatMemory"]
    45d63404_b604_4c09_dcac_77e8840caf9b -->|extends| b010d3e5_8f8c_da0a_9a18_268dce3d2b0b
    0c80829c_1f59_d1c2_1c39_188a5508ea5e["agent_token_buffer_memory.py"]
    45d63404_b604_4c09_dcac_77e8840caf9b -->|defined in| 0c80829c_1f59_d1c2_1c39_188a5508ea5e
    2eb92345_d197_6a5e_a25d_66c5361cc931["buffer()"]
    45d63404_b604_4c09_dcac_77e8840caf9b -->|method| 2eb92345_d197_6a5e_a25d_66c5361cc931
    9d0453e5_6847_04ba_73d8_0590b5e571f5["memory_variables()"]
    45d63404_b604_4c09_dcac_77e8840caf9b -->|method| 9d0453e5_6847_04ba_73d8_0590b5e571f5
    fe43e0af_845f_0889_e282_d3d7f3a1dcad["load_memory_variables()"]
    45d63404_b604_4c09_dcac_77e8840caf9b -->|method| fe43e0af_845f_0889_e282_d3d7f3a1dcad
    449d8ca3_de3c_50e5_86cb_78b75536307e["save_context()"]
    45d63404_b604_4c09_dcac_77e8840caf9b -->|method| 449d8ca3_de3c_50e5_86cb_78b75536307e
```
Source Code
libs/langchain/langchain_classic/agents/openai_functions_agent/agent_token_buffer_memory.py lines 16–99
```python
class AgentTokenBufferMemory(BaseChatMemory):
    """Memory used to save agent output AND intermediate steps.

    Args:
        human_prefix: Prefix for human messages.
        ai_prefix: Prefix for AI messages.
        llm: Language model.
        memory_key: Key to save memory under.
        max_token_limit: Maximum number of tokens to keep in the buffer.
            Once the buffer exceeds this many tokens, the oldest
            messages will be pruned.
        return_messages: Whether to return messages.
        output_key: Key to save output under.
        intermediate_steps_key: Key to save intermediate steps under.
        format_as_tools: Whether to format as tools.
    """

    human_prefix: str = "Human"
    ai_prefix: str = "AI"
    llm: BaseLanguageModel
    memory_key: str = "history"
    max_token_limit: int = 12000
    """The max number of tokens to keep in the buffer.

    Once the buffer exceeds this many tokens, the oldest messages will be pruned.
    """
    return_messages: bool = True
    output_key: str = "output"
    intermediate_steps_key: str = "intermediate_steps"
    format_as_tools: bool = False

    @property
    def buffer(self) -> list[BaseMessage]:
        """String buffer of memory."""
        return self.chat_memory.messages

    @property
    def memory_variables(self) -> list[str]:
        """Always return list of memory variables."""
        return [self.memory_key]

    @override
    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, Any]:
        """Return history buffer.

        Args:
            inputs: Inputs to the agent.

        Returns:
            A dictionary with the history buffer.
        """
        if self.return_messages:
            final_buffer: Any = self.buffer
        else:
            final_buffer = get_buffer_string(
                self.buffer,
                human_prefix=self.human_prefix,
                ai_prefix=self.ai_prefix,
            )
        return {self.memory_key: final_buffer}
```
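The `return_messages` branch in `load_memory_variables` returns either the raw message list or a single flattened string. A minimal sketch of that behavior, using plain `(role, text)` tuples in place of LangChain message objects; the prefix-joining logic below is an assumption standing in for `get_buffer_string`:

```python
def load_memory_variables(buffer, memory_key="history", return_messages=True,
                          human_prefix="Human", ai_prefix="AI"):
    """Return the buffer either as-is or flattened to one prefixed string."""
    if return_messages:
        return {memory_key: buffer}
    prefix = {"human": human_prefix, "ai": ai_prefix}
    # Stand-in for get_buffer_string: one "Prefix: text" line per message.
    lines = [f"{prefix[role]}: {text}" for role, text in buffer]
    return {memory_key: "\n".join(lines)}

buffer = [("human", "hi"), ("ai", "hello")]
load_memory_variables(buffer)
# → {"history": [("human", "hi"), ("ai", "hello")]}
load_memory_variables(buffer, return_messages=False)
# → {"history": "Human: hi\nAI: hello"}
```

Either way the result is keyed by `memory_key`, so a prompt template can refer to `{history}` regardless of which representation was chosen.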
```python
    def save_context(self, inputs: dict[str, Any], outputs: dict[str, Any]) -> None:
        """Save context from this conversation to buffer. Pruned.

        Args:
            inputs: Inputs to the agent.
            outputs: Outputs from the agent.
        """
        input_str, output_str = self._get_input_output(inputs, outputs)
        self.chat_memory.add_messages(input_str)  # type: ignore[arg-type]
        format_to_messages = (
            format_to_tool_messages
            if self.format_as_tools
            else format_to_openai_function_messages
        )
        steps = format_to_messages(outputs[self.intermediate_steps_key])
        for msg in steps:
            self.chat_memory.add_message(msg)
        self.chat_memory.add_messages(output_str)  # type: ignore[arg-type]
        # Prune buffer if it exceeds max token limit
        buffer = self.chat_memory.messages
        curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
        if curr_buffer_length > self.max_token_limit:
            while curr_buffer_length > self.max_token_limit:
                buffer.pop(0)
                curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
```
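`save_context` appends the input, the formatted intermediate steps, and the output, then prunes oldest-first until the buffer fits under `max_token_limit`. A self-contained sketch of that flow; the whitespace-based `count_tokens` is a crude stand-in for `llm.get_num_tokens_from_messages`:

```python
def count_tokens(messages):
    # Crude stand-in for llm.get_num_tokens_from_messages: whitespace tokens.
    return sum(len(m.split()) for m in messages)

def save_context(buffer, input_msg, step_msgs, output_msg, max_token_limit):
    """Append this turn's messages, then prune oldest-first to fit the limit."""
    buffer.append(input_msg)
    buffer.extend(step_msgs)      # formatted intermediate steps
    buffer.append(output_msg)
    while count_tokens(buffer) > max_token_limit:
        buffer.pop(0)             # drop the oldest message first
    return buffer

buf = []
save_context(buf, "plan the trip",
             ["tool: search flights", "tool: search hotels"],
             "here is an itinerary", max_token_limit=8)
# the oldest messages are dropped until the buffer fits under 8 tokens
```

Note that pruning is FIFO over whole messages: the newest turn (including its intermediate steps) always survives, while the earliest history is sacrificed first.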
Frequently Asked Questions
What is the AgentTokenBufferMemory class?
AgentTokenBufferMemory is a chat memory class in the langchain codebase that saves agent output and intermediate steps in a token-bounded buffer. It is defined in libs/langchain/langchain_classic/agents/openai_functions_agent/agent_token_buffer_memory.py.
Where is AgentTokenBufferMemory defined?
AgentTokenBufferMemory is defined in libs/langchain/langchain_classic/agents/openai_functions_agent/agent_token_buffer_memory.py at line 16.
What does AgentTokenBufferMemory extend?
AgentTokenBufferMemory extends BaseChatMemory.
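Beyond what it extends, the one behavioral switch worth noting is `format_as_tools`: `save_context` selects between two intermediate-step formatters at runtime by treating them as first-class functions. The dispatch pattern is sketched below with hypothetical stand-in formatters (the real ones are `format_to_tool_messages` and `format_to_openai_function_messages`):

```python
def format_to_function_messages(steps):
    # Hypothetical stand-in for format_to_openai_function_messages.
    return [f"function[{action}]: {observation}" for action, observation in steps]

def format_to_tool_messages(steps):
    # Hypothetical stand-in for format_to_tool_messages.
    return [f"tool[{action}]: {observation}" for action, observation in steps]

def select_formatter(format_as_tools):
    # Mirrors the conditional expression in save_context.
    return format_to_tool_messages if format_as_tools else format_to_function_messages

steps = [("search", "3 results")]
select_formatter(False)(steps)  # → ['function[search]: 3 results']
select_formatter(True)(steps)   # → ['tool[search]: 3 results']
```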