
ConversationStringBufferMemory Class — langchain Architecture

Architecture documentation for the ConversationStringBufferMemory class in buffer.py from the langchain codebase.

Entity Profile

ConversationStringBufferMemory is a class defined in libs/langchain/langchain_classic/memory/buffer.py (lines 99–173). It extends BaseMemory and defines validate_chains(), the memory_variables property, load_memory_variables(), aload_memory_variables(), save_context(), asave_context(), clear(), and aclear().

Dependency Diagram

graph TD
  ConversationStringBufferMemory["ConversationStringBufferMemory"]
  BaseMemory["BaseMemory"]
  ConversationStringBufferMemory -->|extends| BaseMemory
  buffer_py["buffer.py"]
  ConversationStringBufferMemory -->|defined in| buffer_py
  validate_chains["validate_chains()"]
  ConversationStringBufferMemory -->|method| validate_chains
  memory_variables["memory_variables()"]
  ConversationStringBufferMemory -->|method| memory_variables
  load_memory_variables["load_memory_variables()"]
  ConversationStringBufferMemory -->|method| load_memory_variables
  aload_memory_variables["aload_memory_variables()"]
  ConversationStringBufferMemory -->|method| aload_memory_variables
  save_context["save_context()"]
  ConversationStringBufferMemory -->|method| save_context
  asave_context["asave_context()"]
  ConversationStringBufferMemory -->|method| asave_context
  clear["clear()"]
  ConversationStringBufferMemory -->|method| clear
  aclear["aclear()"]
  ConversationStringBufferMemory -->|method| aclear

Source Code

libs/langchain/langchain_classic/memory/buffer.py lines 99–173

class ConversationStringBufferMemory(BaseMemory):
    """A basic memory implementation that simply stores the conversation history.

    This stores the entire conversation history in memory without any
    additional processing.

    Equivalent to ConversationBufferMemory but tailored more specifically
    for string-based conversations rather than chat models.

    Note that additional processing may be required in some situations when the
    conversation history is too large to fit in the context window of the model.
    """

    human_prefix: str = "Human"
    ai_prefix: str = "AI"
    """Prefix to use for AI generated responses."""
    buffer: str = ""
    output_key: str | None = None
    input_key: str | None = None
    memory_key: str = "history"

    @pre_init
    def validate_chains(cls, values: dict) -> dict:
        """Validate that return messages is not True."""
        if values.get("return_messages", False):
            msg = "return_messages must be False for ConversationStringBufferMemory"
            raise ValueError(msg)
        return values

    @property
    def memory_variables(self) -> list[str]:
        """Will always return list of memory variables."""
        return [self.memory_key]

    @override
    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, str]:
        """Return history buffer."""
        return {self.memory_key: self.buffer}

    async def aload_memory_variables(self, inputs: dict[str, Any]) -> dict[str, str]:
        """Return history buffer."""
        return self.load_memory_variables(inputs)

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        """Save context from this conversation to buffer."""
        if self.input_key is None:
            prompt_input_key = get_prompt_input_key(inputs, self.memory_variables)
        else:
            prompt_input_key = self.input_key
        if self.output_key is None:
            if len(outputs) != 1:
                msg = f"One output key expected, got {outputs.keys()}"
                raise ValueError(msg)
            output_key = next(iter(outputs.keys()))
        else:
            output_key = self.output_key
        human = f"{self.human_prefix}: " + inputs[prompt_input_key]
        ai = f"{self.ai_prefix}: " + outputs[output_key]
        self.buffer += f"\n{human}\n{ai}"

    async def asave_context(
        self,
        inputs: dict[str, Any],
        outputs: dict[str, str],
    ) -> None:
        """Save context from this conversation to buffer."""
        return self.save_context(inputs, outputs)

    def clear(self) -> None:
        """Clear memory contents."""
        self.buffer = ""

    @override
    async def aclear(self) -> None:
        self.clear()
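
Usage Example

The following sketch exercises the methods shown above: it records two exchanges with save_context(), reads the transcript back with load_memory_variables(), and resets the buffer with clear(). The import path is assumed from the file location libs/langchain/langchain_classic/memory/buffer.py and may differ between langchain versions.

from langchain_classic.memory.buffer import ConversationStringBufferMemory

memory = ConversationStringBufferMemory()

# save_context() appends one "Human: ..." / "AI: ..." exchange to the string buffer.
memory.save_context({"input": "Hi, who are you?"}, {"output": "I am an AI assistant."})
memory.save_context({"input": "What can you do?"}, {"output": "I can answer questions."})

# load_memory_variables() returns the buffer under memory_key ("history" by default).
print(memory.load_memory_variables({})["history"])
# Human: Hi, who are you?
# AI: I am an AI assistant.
# Human: What can you do?
# AI: I can answer questions.

# clear() resets the buffer to an empty string.
memory.clear()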

Extends

ConversationStringBufferMemory extends BaseMemory.
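
To make the inheritance relationship concrete, the sketch below implements the same BaseMemory contract (the memory_variables property plus load_memory_variables(), save_context(), and clear()) in a minimal custom class. The class name and behaviour are hypothetical, and the import of BaseMemory from langchain_core.memory is an assumption, since the excerpt above does not show its imports.

from typing import Any

from langchain_core.memory import BaseMemory


class LastTurnMemory(BaseMemory):
    """Hypothetical memory that keeps only the most recent exchange."""

    last: str = ""

    @property
    def memory_variables(self) -> list[str]:
        # Keys this memory contributes to a chain's prompt inputs.
        return ["history"]

    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, Any]:
        return {"history": self.last}

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        # Overwrites the previous turn, unlike ConversationStringBufferMemory,
        # which appends every exchange to a growing string buffer.
        human = next(iter(inputs.values()))
        ai = next(iter(outputs.values()))
        self.last = f"Human: {human}\nAI: {ai}"

    def clear(self) -> None:
        self.last = ""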

Frequently Asked Questions

What is the ConversationStringBufferMemory class?
ConversationStringBufferMemory is a memory class in the langchain codebase that stores the entire conversation history as a single string; it is defined in libs/langchain/langchain_classic/memory/buffer.py.
Where is ConversationStringBufferMemory defined?
ConversationStringBufferMemory is defined in libs/langchain/langchain_classic/memory/buffer.py at line 99.
What does ConversationStringBufferMemory extend?
ConversationStringBufferMemory extends BaseMemory.
