Langchain's ConversationBufferWindowMemory seems to keep everything in memory regardless of configuration


I am using Langchain's ConversationBufferWindowMemory(k=1), expecting the model to remember only the most recent interaction rather than the entire conversation history. However, when I inspect the memory, I find that older interactions are not cleared from it as expected. How can I fix this?

On top of that, even though the older conversation is never removed from memory, the model still cannot answer questions based on those previous interactions.

from langchain_ollama import ChatOllama
from langchain.chains.conversation.memory import ConversationBufferWindowMemory
from langchain.chains import ConversationChain

llm = ChatOllama(
    model="llama3-chatqa:latest",
    temperature=0,
)

conversation = ConversationChain(
    llm=llm,
    # k=1: expect only the most recent exchange to be remembered
    memory=ConversationBufferWindowMemory(k=1)
)

while True:
    message = input("Human: ")
    if message == "Done":
        break

    # Print the input message
    print("\n \n ########## Input:")
    print(message)

    # Print the current context before invoking the model
    print("########## Context:")
    print(conversation.memory)

    response = conversation.invoke(message)["response"]

    # Print the response
    print("########## AI response:")
    print(response)

If you look at the output below, my very first exchange is clearly still present in the context.

But even though the memory holds everything, the model cannot answer a question about that history.

Human: my aim is to learn machine learning today 


 ########## Input:
my aim is to learn machine learning today
########## Context:
chat_memory=InMemoryChatMessageHistory(messages=[]) k=1
########## AI response:
 Great! I can help you with that. What specifically would you like to learn about machine learning?


Human: who is the PM of India?
 ########## Input:
who is the PM of India?
########## Context:
chat_memory=InMemoryChatMessageHistory(messages=[HumanMessage(content='my aim is to learn machine learning today', additional_kwargs={}, response_metadata={}), AIMessage(content=' Great! I can help you with that. What specifically would you like to learn about machine learning?', additional_kwargs={}, response_metadata={})]) k=1
########## AI response:
 Prime Minister Narendra Modi


Human: Who is vladmir putin ?
 ########## Input:
Who is vladmir putin ?
########## Context:
chat_memory=InMemoryChatMessageHistory(messages=[HumanMessage(content='my aim is to learn machine learning today', additional_kwargs={}, response_metadata={}), AIMessage(content=' Great! I can help you with that. What specifically would you like to learn about machine learning?', additional_kwargs={}, response_metadata={}), HumanMessage(content='who is the PM of India?', additional_kwargs={}, response_metadata={}), AIMessage(content=' Prime Minister Narendra Modi', additional_kwargs={}, response_metadata={})]) k=1
########## AI response:
 Vladimir Putin


Human: what is my aim for today ?
 ########## Input:
what is my aim for today ?
########## Context:
chat_memory=InMemoryChatMessageHistory(messages=[HumanMessage(content='my aim is to learn machine learning today', additional_kwargs={}, response_metadata={}), AIMessage(content=' Great! I can help you with that. What specifically would you like to learn about machine learning?', additional_kwargs={}, response_metadata={}), HumanMessage(content='who is the PM of India?', additional_kwargs={}, response_metadata={}), AIMessage(content=' Prime Minister Narendra Modi', additional_kwargs={}, response_metadata={}), HumanMessage(content='Who is vladmir putin ?', additional_kwargs={}, response_metadata={}), AIMessage(content=' Vladimir Putin', additional_kwargs={}, response_metadata={})]) k=1
########## AI response:
 Your aim for today is to get up and go about your day.
Tags: langchain, llama, ollama
1 Answer (0 votes)

This is the expected behavior: even though the memory object keeps a record of all previous exchanges, only the last k exchanges are passed to the model for prediction.
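A minimal sketch of that distinction, assuming the legacy langchain memory API: the backing store (chat_memory) keeps every message, while the k-sized window is only applied when load_memory_variables builds the prompt history.

from langchain.chains.conversation.memory import ConversationBufferWindowMemory

memory = ConversationBufferWindowMemory(k=1)
memory.save_context(
    {"input": "my aim is to learn machine learning today"},
    {"output": "Great! I can help you with that."},
)
memory.save_context(
    {"input": "who is the PM of India?"},
    {"output": "Prime Minister Narendra Modi"},
)

# The backing store still holds every message (2 exchanges = 4 messages)...
print(len(memory.chat_memory.messages))  # 4

# ...but only the last k exchanges are rendered into the prompt history.
print(memory.load_memory_variables({}))  # only the "PM of India" exchange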

The main motive here is to optimize the token count sent to the model, not the process's main memory.
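One way to confirm this, assuming the same chain as in the question: passing verbose=True makes ConversationChain print the fully formatted prompt, so you can see that only the last k exchanges ever reach the model.

conversation = ConversationChain(
    llm=llm,
    memory=ConversationBufferWindowMemory(k=1),
    verbose=True,  # prints the prompt actually sent to the LLM on each call
)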

So if you want the stored history trimmed as well, I think it is best to write your own code to prune the memory.
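A minimal sketch of such pruning, assuming you want the backing store itself limited to the last k exchanges (each exchange is one HumanMessage plus one AIMessage, i.e. two messages):

k = 1
response = conversation.invoke(message)["response"]

# After each turn, keep only the last k Human/AI pairs in the store itself.
history = conversation.memory.chat_memory.messages
conversation.memory.chat_memory.messages = history[-2 * k:]

With this in place, the Context dump in your loop would show only the most recent exchange instead of the full history.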
