I need some help. I'm building a chatbot with LangChain and Pinecone; I've tried the different chains I found, but none of them really work properly. What is the best way to create a chain with memory, a retriever, map_reduce, and a prompt specifying that the bot should only answer from the knowledge in the retriever? I also have a follow-up question: does the retriever find the chunks/documents that are most similar to the question, and how many does it return?
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationSummaryBufferMemory

llm = ChatOpenAI(
    openai_api_key=OPENAI_API_KEY,
    model_name='gpt-3.5-turbo',
    temperature=0.0
)

# conversational memory
conversational_memory = ConversationSummaryBufferMemory(
    llm=llm,
    memory_key='chat_history',
    max_token_limit=1000,
    return_messages=True
)

# retrieval qa chain (vectorstore is my existing Pinecone index)
from langchain.chains import RetrievalQA
qa_with_sources = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="map_reduce",
    retriever=vectorstore.as_retriever()
)

from langchain.agents import Tool
tools = [
    Tool(
        name='Knowledge Base',
        func=qa_with_sources.run,  # .run returns a plain string, which the agent expects
        description=(
            'use this tool to answer with the text retrieved only'
        )
    )
]

from langchain.agents import initialize_agent
agent = initialize_agent(
    agent='chat-conversational-react-description',
    tools=tools,
    llm=llm,
    verbose=True,
    max_iterations=3,
    early_stopping_method='generate',
    memory=conversational_memory
)

query = input("Ask me anything: ")
agent(query)
Based on your code, you can try the following LangChain code, which uses:

- Pinecone vector DB (as a prerequisite, you need to set up a project/index and get an API key)
- VectorStoreRetrieverMemory: memory backed by vector-store retrieval
- sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2 as the embedding model
- ConversationChain with that memory

from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
    model_name='gpt-3.5-turbo',
    temperature=0.0
)

# multilingual sentence-transformers model used for embeddings
from langchain_community.embeddings import HuggingFaceEmbeddings
embedding = HuggingFaceEmbeddings(
    model_name="sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

# Pinecone vector store (the index "langchain-test-index" must already exist)
from langchain_pinecone import PineconeVectorStore
from langchain.memory import VectorStoreRetrieverMemory

vectorstore = PineconeVectorStore.from_texts(
    ["Harry Potter's owl is in the castle."], embedding=embedding,
    index_name="langchain-test-index")

# the retriever returns the 2 most similar chunks; the memory is backed by it
retriever = vectorstore.as_retriever(search_kwargs=dict(k=2))
memory = VectorStoreRetrieverMemory(retriever=retriever)

# prompt that injects the retrieved "history" before the user question
from langchain.prompts import PromptTemplate
TEMPLATE = """You're a helpful assistant, aiming at solving the problem.
Relevant pieces of previous conversation:
{history}
(You do not need to use these pieces of information if not relevant)
Answer my question: {input}
"""
PROMPT = PromptTemplate(
    input_variables=["history", "input"], template=TEMPLATE
)

# conversation chain combining the LLM, the prompt, and the retrieval-backed memory
from langchain.chains import ConversationChain
conversation_with_summary = ConversationChain(
    llm=llm,
    prompt=PROMPT,
    memory=memory,
    verbose=True
)

# simple REPL loop; type "exit" to quit
while True:
    user_input = input("Enter your question: ")
    if user_input == "exit":
        break
    else:
        result = conversation_with_summary.predict(input=user_input)
        print(result)
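If you want to stay closer to what you originally asked for (conversational memory + retriever + map_reduce + a prompt restricting the bot to the retrieved text), one option is the legacy ConversationalRetrievalChain. The sketch below reuses llm and vectorstore from the code above and assumes your LangChain version still ships these classes; the prompt wording and the example question are my own, and combine_docs_chain_kwargs is how the underlying map_reduce QA chain is given a custom combine-step prompt:

from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate

# the memory must expose the history under "chat_history" for this chain
chat_memory = ConversationBufferMemory(
    memory_key="chat_history", return_messages=True
)

# reduce-step prompt that restricts answers to the retrieved text
# (wording is my own; {summaries} and {question} are the variables
#  the map_reduce combine step expects)
combine_prompt = PromptTemplate(
    input_variables=["summaries", "question"],
    template=(
        "Answer using ONLY the extracted parts below. "
        "If the answer is not contained in them, say you don't know.\n\n"
        "{summaries}\n\nQuestion: {question}\nAnswer:"
    ),
)

qa_chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=vectorstore.as_retriever(search_kwargs={"k": 2}),
    memory=chat_memory,
    chain_type="map_reduce",
    combine_docs_chain_kwargs={"combine_prompt": combine_prompt},
)

result = qa_chain({"question": "Where is Harry Potter's owl?"})  # example question
print(result["answer"])

This chain first condenses the chat history plus the new question into a standalone question, retrieves the top-k chunks, and then runs the map_reduce QA step over them, so the memory and the retrieval stay decoupled.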
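As for your follow-up question: yes, the retriever runs a similarity search against the Pinecone index and returns the k most similar chunks/documents. In the code above that is k=2 (set via search_kwargs); if you don't set it, LangChain's similarity search defaults to 4. To see which chunks are matched and how similar they are, you can query the vector store directly. A minimal sketch (the query string is just an example, and how to read the score depends on the distance metric of your index):

# inspect the raw matches and their similarity scores
docs_and_scores = vectorstore.similarity_search_with_score(
    "Where is Harry Potter's owl?", k=2
)
for doc, score in docs_and_scores:
    print(f"{score:.3f}  {doc.page_content}")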