BaseChatMemory Class — langchain Architecture

Architecture documentation for the BaseChatMemory class in chat_memory.py from the langchain codebase.

Dependency Diagram

graph TD
  BaseChatMemory["BaseChatMemory"]
  BaseMemory["BaseMemory"]
  ChatMemoryFile["chat_memory.py"]
  GetInputOutput["_get_input_output()"]
  SaveContext["save_context()"]
  ASaveContext["asave_context()"]
  Clear["clear()"]
  AClear["aclear()"]
  BaseChatMemory -->|extends| BaseMemory
  BaseChatMemory -->|defined in| ChatMemoryFile
  BaseChatMemory -->|method| GetInputOutput
  BaseChatMemory -->|method| SaveContext
  BaseChatMemory -->|method| ASaveContext
  BaseChatMemory -->|method| Clear
  BaseChatMemory -->|method| AClear

Source Code

libs/langchain/langchain_classic/memory/chat_memory.py lines 25–104

class BaseChatMemory(BaseMemory, ABC):
    """Abstract base class for chat memory.

    **ATTENTION** This abstraction was created prior to when chat models had
        native tool calling capabilities.
        It does **NOT** support native tool calling capabilities for chat models and
        will fail SILENTLY if used with a chat model that has native tool calling.

    DO NOT USE THIS ABSTRACTION FOR NEW CODE.
    """

    chat_memory: BaseChatMessageHistory = Field(
        default_factory=InMemoryChatMessageHistory,
    )
    output_key: str | None = None
    input_key: str | None = None
    return_messages: bool = False

    def _get_input_output(
        self,
        inputs: dict[str, Any],
        outputs: dict[str, str],
    ) -> tuple[str, str]:
        if self.input_key is None:
            prompt_input_key = get_prompt_input_key(inputs, self.memory_variables)
        else:
            prompt_input_key = self.input_key
        if self.output_key is None:
            if len(outputs) == 1:
                output_key = next(iter(outputs.keys()))
            elif "output" in outputs:
                output_key = "output"
                warnings.warn(
                    f"'{self.__class__.__name__}' got multiple output keys:"
                    f" {outputs.keys()}. The default 'output' key is being used."
                    f" If this is not desired, please manually set 'output_key'.",
                    stacklevel=3,
                )
            else:
                msg = (
                    f"Got multiple output keys: {outputs.keys()}, cannot "
                    f"determine which to store in memory. Please set the "
                    f"'output_key' explicitly."
                )
                raise ValueError(msg)
        else:
            output_key = self.output_key
        return inputs[prompt_input_key], outputs[output_key]

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        """Save context from this conversation to buffer."""
        input_str, output_str = self._get_input_output(inputs, outputs)
        self.chat_memory.add_messages(
            [
                HumanMessage(content=input_str),
                AIMessage(content=output_str),
            ],
        )

    async def asave_context(
        self,
        inputs: dict[str, Any],
        outputs: dict[str, str],
    ) -> None:
        """Save context from this conversation to buffer."""
        input_str, output_str = self._get_input_output(inputs, outputs)
        await self.chat_memory.aadd_messages(
            [
                HumanMessage(content=input_str),
                AIMessage(content=output_str),
            ],
        )

    def clear(self) -> None:
        """Clear memory contents."""
        self.chat_memory.clear()

    async def aclear(self) -> None:
        """Clear memory contents."""
        await self.chat_memory.aclear()

Extends

BaseChatMemory extends BaseMemory (and ABC).

Frequently Asked Questions

What is the BaseChatMemory class?
BaseChatMemory is an abstract base class for chat memory in the langchain codebase, defined in libs/langchain/langchain_classic/memory/chat_memory.py. It stores each conversation turn as a HumanMessage/AIMessage pair in its chat_memory history; per its docstring, it should not be used for new code.
Where is BaseChatMemory defined?
BaseChatMemory is defined in libs/langchain/langchain_classic/memory/chat_memory.py at line 25.
What does BaseChatMemory extend?
BaseChatMemory extends BaseMemory.
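To make the storage pattern concrete without importing langchain, the sketch below re-implements the core of save_context with toy stand-ins: HumanMessage, AIMessage, and InMemoryHistory here are hypothetical minimal analogues of the real langchain_core types, defined only for illustration.

```python
from dataclasses import dataclass

# Toy stand-ins for langchain_core's message types (illustrative only).
@dataclass
class HumanMessage:
    content: str

@dataclass
class AIMessage:
    content: str

class InMemoryHistory:
    """Toy analogue of InMemoryChatMessageHistory: a plain message list."""
    def __init__(self) -> None:
        self.messages: list = []

    def add_messages(self, messages: list) -> None:
        self.messages.extend(messages)

def save_context(history: InMemoryHistory, input_str: str, output_str: str) -> None:
    # One conversational turn becomes a Human/AI message pair,
    # mirroring what BaseChatMemory.save_context does after
    # _get_input_output has resolved the input and output strings.
    history.add_messages([HumanMessage(input_str), AIMessage(output_str)])

history = InMemoryHistory()
save_context(history, "What is LangChain?", "A framework for LLM apps.")
print(len(history.messages))  # → 2
```

In the real class, asave_context follows the same shape but awaits chat_memory.aadd_messages, and clear/aclear simply delegate to the underlying history's clear methods.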
