With `LLMChain` deprecated in LangChain v0.2, I am struggling to get `ConversationSummaryMemory` working again. My chatbot uses `RunnableWithMessageHistory` with `FileChatMessageHistory`, like this:
```python
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
)
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_message_histories import FileChatMessageHistory
from langchain_openai import ChatOpenAI

chat = ChatOpenAI()

prompt = ChatPromptTemplate.from_messages([
    MessagesPlaceholder(variable_name="messages"),
    HumanMessagePromptTemplate.from_template("{content}"),
])

chain = prompt | chat

def get_session_history(session_id: str) -> BaseChatMessageHistory:
    # Persist each session's messages to its own JSON file.
    return FileChatMessageHistory(f"messages_{session_id}.json")

with_message_history = RunnableWithMessageHistory(
    chain,
    get_session_history=get_session_history,
    input_messages_key="content",
    history_messages_key="messages",
)

while True:
    content = input(">> ")
    result = with_message_history.invoke(
        input={"content": content},
        config={"configurable": {"session_id": "abc123"}},
    )
    print(result.content)
```
Instead of remembering all messages verbatim, I would like them summarized. With `LLMChain` I was able to use `ConversationSummaryMemory`. In v0.2 I cannot use it with `RunnableWithMessageHistory`, because `ConversationSummaryMemory` is not a subclass of `BaseChatMessageHistory`, which is what `get_session_history()` must return.

What is the recommended best practice for keeping a summarized history with `RunnableWithMessageHistory` in LangChain v0.2?
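To make the behavior I'm after concrete, here is a framework-free sketch: keep the last few messages verbatim and fold older ones into a running summary. The `SummarizingHistory` class and its `_summarize` placeholder are hypothetical names of my own, not LangChain APIs, and a real implementation would produce the summary with an LLM call instead of string concatenation.

```python
from dataclasses import dataclass, field

@dataclass
class SummarizingHistory:
    """Toy stand-in for the summarizing history I want (not a LangChain class)."""
    keep_last: int = 2
    summary: str = ""
    messages: list[str] = field(default_factory=list)

    def add_message(self, message: str) -> None:
        self.messages.append(message)
        # Once too many messages accumulate, fold the oldest into the summary.
        while len(self.messages) > self.keep_last:
            oldest = self.messages.pop(0)
            self.summary = self._summarize(self.summary, oldest)

    def _summarize(self, summary: str, message: str) -> str:
        # Placeholder: a real implementation would call an LLM here.
        return (summary + " | " + message).strip(" |")

    def context(self) -> list[str]:
        # What would actually be sent to the model: summary first, then recent turns.
        prefix = [f"Summary so far: {self.summary}"] if self.summary else []
        return prefix + self.messages

history = SummarizingHistory(keep_last=2)
for msg in ["hi", "how are you?", "tell me a joke", "another one"]:
    history.add_message(msg)

print(history.context())
# → ['Summary so far: hi | how are you?', 'tell me a joke', 'another one']
```

The question is essentially how to get this condense-as-you-go behavior behind the `BaseChatMessageHistory` interface that `get_session_history()` must return.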