chat_memory.py — langchain Source File
Architecture documentation for chat_memory.py, a Python file in the langchain codebase. 9 imports, 0 dependents.
Entity Profile
Dependency Diagram
graph LR
  ee92ab40_77fa_9411_a52d_b678f3a6d677["chat_memory.py"]
  f3365e3c_fb7a_bb9a_bc79_059b06cb7024["warnings"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> f3365e3c_fb7a_bb9a_bc79_059b06cb7024
  50e20440_a135_6be3_a5a5_67791be5a2a6["abc"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> 50e20440_a135_6be3_a5a5_67791be5a2a6
  feec1ec4_6917_867b_d228_b134d0ff8099["typing"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> feec1ec4_6917_867b_d228_b134d0ff8099
  2485b66a_3839_d0b6_ad9c_a4ff40457dc6["langchain_core._api"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> 2485b66a_3839_d0b6_ad9c_a4ff40457dc6
  b70220ee_230d_1b24_69ea_cc9490f5f3c0["langchain_core.chat_history"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> b70220ee_230d_1b24_69ea_cc9490f5f3c0
  9444498b_8066_55c7_b3a2_1d90c4162a32["langchain_core.messages"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> 9444498b_8066_55c7_b3a2_1d90c4162a32
  dd5e7909_a646_84f1_497b_cae69735550e["pydantic"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> dd5e7909_a646_84f1_497b_cae69735550e
  8d1ab66e_47c1_1140_c3a5_5112af3b1cac["langchain_classic.base_memory"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> 8d1ab66e_47c1_1140_c3a5_5112af3b1cac
  195e5a37_104f_e0be_e49a_f5d06550a9be["langchain_classic.memory.utils"]
  ee92ab40_77fa_9411_a52d_b678f3a6d677 --> 195e5a37_104f_e0be_e49a_f5d06550a9be
  style ee92ab40_77fa_9411_a52d_b678f3a6d677 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
import warnings
from abc import ABC
from typing import Any

from langchain_core._api import deprecated
from langchain_core.chat_history import (
    BaseChatMessageHistory,
    InMemoryChatMessageHistory,
)
from langchain_core.messages import AIMessage, HumanMessage
from pydantic import Field

from langchain_classic.base_memory import BaseMemory
from langchain_classic.memory.utils import get_prompt_input_key


@deprecated(
    since="0.3.1",
    removal="1.0.0",
    message=(
        "Please see the migration guide at: "
        "https://python.langchain.com/docs/versions/migrating_memory/"
    ),
)
class BaseChatMemory(BaseMemory, ABC):
    """Abstract base class for chat memory.

    **ATTENTION** This abstraction was created prior to when chat models had
    native tool calling capabilities.

    It does **NOT** support native tool calling capabilities for chat models and
    will fail SILENTLY if used with a chat model that has native tool calling.

    DO NOT USE THIS ABSTRACTION FOR NEW CODE.
    """

    chat_memory: BaseChatMessageHistory = Field(
        default_factory=InMemoryChatMessageHistory,
    )
    output_key: str | None = None
    input_key: str | None = None
    return_messages: bool = False

    def _get_input_output(
        self,
        inputs: dict[str, Any],
        outputs: dict[str, str],
    ) -> tuple[str, str]:
        if self.input_key is None:
            prompt_input_key = get_prompt_input_key(inputs, self.memory_variables)
        else:
            prompt_input_key = self.input_key
        if self.output_key is None:
            if len(outputs) == 1:
                output_key = next(iter(outputs.keys()))
            elif "output" in outputs:
                output_key = "output"
                warnings.warn(
                    f"'{self.__class__.__name__}' got multiple output keys:"
                    f" {outputs.keys()}. The default 'output' key is being used."
                    f" If this is not desired, please manually set 'output_key'.",
                    stacklevel=3,
                )
            else:
                msg = (
                    f"Got multiple output keys: {outputs.keys()}, cannot "
                    f"determine which to store in memory. Please set the "
                    f"'output_key' explicitly."
                )
                raise ValueError(msg)
        else:
            output_key = self.output_key
        return inputs[prompt_input_key], outputs[output_key]

    def save_context(self, inputs: dict[str, Any], outputs: dict[str, str]) -> None:
        """Save context from this conversation to buffer."""
        input_str, output_str = self._get_input_output(inputs, outputs)
        self.chat_memory.add_messages(
            [
                HumanMessage(content=input_str),
                AIMessage(content=output_str),
            ],
        )

    async def asave_context(
        self,
        inputs: dict[str, Any],
        outputs: dict[str, str],
    ) -> None:
        """Save context from this conversation to buffer."""
        input_str, output_str = self._get_input_output(inputs, outputs)
        await self.chat_memory.aadd_messages(
            [
                HumanMessage(content=input_str),
                AIMessage(content=output_str),
            ],
        )

    def clear(self) -> None:
        """Clear memory contents."""
        self.chat_memory.clear()

    async def aclear(self) -> None:
        """Clear memory contents."""
        await self.chat_memory.aclear()
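BaseChatMemory is abstract: it supplies save_context, asave_context, clear, and aclear, while BaseMemory still requires subclasses to implement memory_variables and load_memory_variables. Below is a minimal sketch of such a subclass, in the spirit of ConversationBufferMemory; the DemoBufferMemory name and its "history" key are illustrative assumptions, not part of this file.

from typing import Any

from langchain_core.messages import get_buffer_string

from langchain_classic.memory.chat_memory import BaseChatMemory


class DemoBufferMemory(BaseChatMemory):
    """Illustrative concrete memory: exposes the history under one key."""

    memory_key: str = "history"  # hypothetical field, mirrors ConversationBufferMemory

    @property
    def memory_variables(self) -> list[str]:
        # The single variable this memory contributes to prompts.
        return [self.memory_key]

    def load_memory_variables(self, inputs: dict[str, Any]) -> dict[str, Any]:
        # Return raw messages or a flat transcript, depending on return_messages.
        messages = self.chat_memory.messages
        if self.return_messages:
            return {self.memory_key: messages}
        return {self.memory_key: get_buffer_string(messages)}


memory = DemoBufferMemory()  # emits a LangChainDeprecationWarning, per @deprecated
memory.save_context({"input": "hi"}, {"output": "hello!"})
print(memory.load_memory_variables({}))  # {'history': 'Human: hi\nAI: hello!'}
memory.clear()  # empties the default InMemoryChatMessageHistory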
Domain
- LangChainCore
Subdomains
- LanguageModelBase
Classes
- BaseChatMemory
Dependencies
- abc
- langchain_classic.base_memory
- langchain_classic.memory.utils
- langchain_core._api
- langchain_core.chat_history
- langchain_core.messages
- pydantic
- typing
- warnings
Frequently Asked Questions
What does chat_memory.py do?
chat_memory.py is a Python source file in the langchain codebase. It defines BaseChatMemory, a deprecated abstract base class that saves each conversation turn as a HumanMessage/AIMessage pair in a BaseChatMessageHistory, after resolving which input and output keys to store from the dicts passed to save_context. It belongs to the LangChainCore domain, LanguageModelBase subdomain.
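The trickiest part of save_context is deciding which output value to store. A hedged sketch of that resolution logic, reusing the illustrative DemoBufferMemory subclass from the Source Code section above:

import warnings

memory = DemoBufferMemory()

# One output key: used as-is, no configuration needed.
memory.save_context({"input": "hi"}, {"output": "hello!"})

# Several output keys, one of them named "output": a warning is emitted
# and the "output" value is stored.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    memory.save_context(
        {"input": "next"},
        {"output": "stored", "run_info": "ignored"},
    )
assert "multiple output keys" in str(caught[0].message)

# Several output keys, none named "output": ValueError unless output_key
# is set explicitly.
try:
    memory.save_context({"input": "q"}, {"answer": "x", "sources": "y"})
except ValueError:
    pass

picky = DemoBufferMemory(input_key="input", output_key="answer")
picky.save_context({"input": "q"}, {"answer": "x", "sources": "y"})  # stores "x"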
What does chat_memory.py depend on?
chat_memory.py imports 9 modules: abc, langchain_classic.base_memory, langchain_classic.memory.utils, langchain_core._api, langchain_core.chat_history, langchain_core.messages, pydantic, typing, and warnings.
Where is chat_memory.py in the architecture?
chat_memory.py is located at libs/langchain/langchain_classic/memory/chat_memory.py (domain: LangChainCore, subdomain: LanguageModelBase, directory: libs/langchain/langchain_classic/memory).