I have a function that accepts a language model, a vector store, a question, and tools, and returns a response. I have not added the tools parameter yet because, based on this example, `.bind_tools` is not an attribute of `llm`. The `llm` is created below:
## Bedrock Client
bedrock_client = boto3.client(service_name="bedrock-runtime", region_name="us-west-2")
bedrock_embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1", client=bedrock_client)
llm = Bedrock(model_id="anthropic.claude-v2:1", client=bedrock_client,
              model_kwargs={'max_tokens_to_sample': 512})
I do not want to change the LLM to `ChatOpenAI` as in the examples; I am looking for a reference on how to bind tools to a LangChain Bedrock LLM. I have also tried tool rendering, but that does not work either. Below is my main get-response function:
def get_response(llm, vectorstore, question, tools):
    ## create prompt / template this helps to guide the AI on what to look out for and how to answer
    prompt_template = """
System: You are a helpful ai bot, your name is Alex, you are to provide information to humans based on faq and user information, in the user information provided you are to extract the users' firstName and lastName from the json payload and recognize that as the persons name. use the currencyVerificationData to determine the number of currency accounts that the user has and if they are approved if the status is VALID, other statuses will indicate that the user is not yet approved and needs to provide more information for validation. use bankFilledData as the users beneficiaries, from that section of the payload you would be able to extract the beneficiaries bankName, bankAccountNumber; use accountDetails as information for bank account detail information;
Human: Please use the given context to provide concise answer to the question
If you don't know the answer, just say that you don't know, don't try to make up an answer.
If you need clarity, ask more questions, do not refer to the json payload when answering questions just use the values you retrieve from the payload to answer
<context>
{context}
</context>
The way you use the information is to identify users name and use it in response
Question: {question}
Assistant:"""
    # llm.bind_tools(tools)  # not working, python error: attribute not found
    PROMPT = PromptTemplate(
        template=prompt_template, input_variables=["context", "question", "user_information"]
    )
    qa = RetrievalQA.from_chain_type(
        llm=llm,
        chain_type="stuff",
        retriever=vectorstore.as_retriever(
            search_type="similarity", search_kwargs={"k": 5}
        ),
        return_source_documents=True,
        chain_type_kwargs={"prompt": PROMPT}
    )
    answer = qa({"query": question})
    return answer['result']
In the end, all I want is a way to call functions (tools) based on the user's input.
Since you have not provided your imports or library versions, I am assuming you are using `Bedrock` from `langchain_community` and assigning it to `llm`. The `Bedrock` class does not implement a `bind_tools` method, which is why the attribute is not found. Switch to the chat model `ChatBedrock` (from the `langchain_aws` package), which does implement `bind_tools`.
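
As a rough sketch, assuming `langchain-aws` is installed, a Bedrock model that supports tool calling (Claude v2 does not, so a Claude 3 model id is used here), and a purely hypothetical `get_currency_accounts` tool, binding and invoking could look like this:

import boto3
from langchain_aws import ChatBedrock
from langchain_core.tools import tool

bedrock_client = boto3.client(service_name="bedrock-runtime", region_name="us-west-2")

@tool
def get_currency_accounts(user_id: str) -> str:
    """Return the currency accounts for a user."""
    # hypothetical tool used only for illustration
    return "user 123 has 2 currency accounts, both VALID"

# ChatBedrock is a chat model, so it exposes bind_tools; the underlying
# Bedrock model must also support tool calling (e.g. a Claude 3 model).
llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    client=bedrock_client,
    model_kwargs={"temperature": 0},
)

llm_with_tools = llm.bind_tools([get_currency_accounts])

response = llm_with_tools.invoke("How many currency accounts does user 123 have?")
print(response.tool_calls)  # the tool invocations the model decided to make, if any

Note that binding tools only makes the model emit tool calls; running the chosen tool and feeding the result back (or wrapping the tool-bound model in an agent) is still up to your code, and `RetrievalQA` will not do that part for you.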