ConversationTokenBufferMemory Class — langchain Architecture

Architecture documentation for the ConversationTokenBufferMemory class, defined in token_buffer.py in the langchain codebase.

Entity Profile

Dependency Diagram

graph TD
  ConversationTokenBufferMemory["ConversationTokenBufferMemory"]
  BaseChatMemory["BaseChatMemory"]
  ConversationTokenBufferMemory -->|extends| BaseChatMemory
  token_buffer["token_buffer.py"]
  ConversationTokenBufferMemory -->|defined in| token_buffer
  buffer["buffer"]
  ConversationTokenBufferMemory -->|property| buffer
  buffer_as_str["buffer_as_str"]
  ConversationTokenBufferMemory -->|property| buffer_as_str
  buffer_as_messages["buffer_as_messages"]
  ConversationTokenBufferMemory -->|property| buffer_as_messages
  memory_variables["memory_variables"]
  ConversationTokenBufferMemory -->|property| memory_variables
  load_memory_variables["load_memory_variables()"]
  ConversationTokenBufferMemory -->|method| load_memory_variables
  save_context["save_context()"]
  ConversationTokenBufferMemory -->|method| save_context

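The properties in the diagram form the read side of the API: buffer dispatches to buffer_as_str or buffer_as_messages depending on return_messages, and memory_variables names the key under which load_memory_variables returns the buffer. A minimal sketch of that read side, assuming an import path and chat model that may differ across langchain versions:

# Read-side sketch (import paths and ChatOpenAI are assumptions; attribute names
# come from the source excerpt below).
from langchain.memory import ConversationTokenBufferMemory
from langchain_openai import ChatOpenAI

memory = ConversationTokenBufferMemory(llm=ChatOpenAI(model="gpt-4o-mini"))
print(memory.memory_variables)    # ['history']
print(memory.buffer_as_str)       # "" while the conversation is empty
print(memory.buffer_as_messages)  # [] list of BaseMessage objects
print(memory.buffer)              # dispatches on return_messages (a string here)
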
Source Code

libs/langchain/langchain_classic/memory/token_buffer.py lines 19–71

class ConversationTokenBufferMemory(BaseChatMemory):
    """Conversation chat memory with token limit.

    Keeps only the most recent messages in the conversation under the constraint
    that the total number of tokens in the conversation does not exceed a certain limit.
    """

    human_prefix: str = "Human"
    ai_prefix: str = "AI"
    llm: BaseLanguageModel
    memory_key: str = "history"
    max_token_limit: int = 2000

    @property
    def buffer(self) -> Any:
        """String buffer of memory."""
        return self.buffer_as_messages if self.return_messages else self.buffer_as_str

    @property
    def buffer_as_str(self) -> str:
        """Exposes the buffer as a string in case return_messages is False."""
        return get_buffer_string(
            self.chat_memory.messages,
            human_prefix=self.human_prefix,
            ai_prefix=self.ai_prefix,
        )

    @property
    def buffer_as_messages(self) -> list[BaseMessage]:
        """Exposes the buffer as a list of messages in case return_messages is True."""
        return self.chat_memory.messages

    @property
    def memory_variables(self) -> list[str]:
        """Will always return list of memory variables."""
        return [self.memory_key]

    @override
    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, Any]:
        """Return history buffer."""
        return {self.memory_key: self.buffer}

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        """Save context from this conversation to buffer. Pruned."""
        super().save_context(inputs, outputs)
        # Prune buffer if it exceeds max token limit
        buffer = self.chat_memory.messages
        curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)
        if curr_buffer_length > self.max_token_limit:
            pruned_memory = []
            while curr_buffer_length > self.max_token_limit:
                pruned_memory.append(buffer.pop(0))
                curr_buffer_length = self.llm.get_num_tokens_from_messages(buffer)

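For orientation, here is a usage sketch of the behavior above. It is not part of the source file: the import paths and the ChatOpenAI model are assumptions that may vary across langchain versions; any BaseLanguageModel that implements get_num_tokens_from_messages works.

# Usage sketch (not from token_buffer.py): import paths and ChatOpenAI are
# assumptions and may differ by langchain version.
from langchain.memory import ConversationTokenBufferMemory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # any model that can count message tokens

memory = ConversationTokenBufferMemory(
    llm=llm,                # used only for get_num_tokens_from_messages
    max_token_limit=2000,   # prune oldest messages once this budget is exceeded
    return_messages=False,  # expose the buffer as a "Human:/AI:" string
)

# Each call appends one human/AI exchange, then drops messages from the front
# of the buffer until the token count is back under max_token_limit.
memory.save_context({"input": "Hi, I'm Ada."}, {"output": "Hello Ada!"})
memory.save_context({"input": "What's my name?"}, {"output": "You said it's Ada."})

print(memory.load_memory_variables({}))
# {'history': "Human: Hi, I'm Ada.\nAI: Hello Ada!\nHuman: What's my name?\n..."}

With return_messages=True, load_memory_variables returns the pruned list of BaseMessage objects instead of a formatted string.
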
Extends

BaseChatMemory

Frequently Asked Questions

What is the ConversationTokenBufferMemory class?
ConversationTokenBufferMemory is a conversation memory class in the langchain codebase that keeps only the most recent messages, pruning the oldest ones whenever the buffer's token count exceeds max_token_limit. It is defined in libs/langchain/langchain_classic/memory/token_buffer.py.
Where is ConversationTokenBufferMemory defined?
ConversationTokenBufferMemory is defined in libs/langchain/langchain_classic/memory/token_buffer.py at line 19.
What does ConversationTokenBufferMemory extend?
ConversationTokenBufferMemory extends BaseChatMemory.

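In practice, this inheritance means ConversationTokenBufferMemory does not declare the underlying message store itself: chat_memory and return_messages (both used in the excerpt above) come from BaseChatMemory, and this class only layers token-based pruning on top. A minimal sketch, with import paths assumed as in the earlier example:

# Inheritance sketch: attribute names come from the excerpt above; import paths
# are assumptions that may differ by langchain version.
from langchain.memory import ConversationTokenBufferMemory
from langchain.memory.chat_memory import BaseChatMemory
from langchain_openai import ChatOpenAI

assert issubclass(ConversationTokenBufferMemory, BaseChatMemory)

memory = ConversationTokenBufferMemory(llm=ChatOpenAI(model="gpt-4o-mini"))
print(memory.chat_memory.messages)  # inherited message store, initially empty
print(memory.return_messages)       # inherited flag, defaults to False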