Pydantic validation error: None is not an allowed value (raised when using LlamaIndex)

Problem description

I am using LlamaIndex with NebulaGraph to build my knowledge graph, and that part is done. Now I am trying another query strategy called KnowledgeGraphQueryEngine. The code is as follows:

from IPython.display import Markdown, display  # needed for the display() call below
from llama_index.core.query_engine import KnowledgeGraphQueryEngine

kgqe_query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,
    service_context=service_context,
    llm=llm,
    verbose=True,
)
response = kgqe_query_engine.query(
    "外观检测有什么作用",  # "What is the purpose of appearance inspection?"
)
display(Markdown(f"<b>{response}</b>"))

I am confident the storage_context and service_context arguments are correct, since the same objects work in other query strategies I have already finished. When I run this cell, it raises a validation error:

ValidationError    Traceback (most recent call last)
Cell In[60], line 8
      1 from llama_index.core.query_engine import KnowledgeGraphQueryEngine
      2 kgqe_query_engine = KnowledgeGraphQueryEngine(
      3     storage_context=storage_context,
      4     service_context=service_context,
      5     llm=llm,
      6     verbose=True,
      7 )
----> 8 response = kgqe_query_engine.query(
      9     "外观检测有什么作用",
     10 )
     11 display(Markdown(f"<b>{response}</b>"))

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/instrumentation/dispatcher.py:274, in Dispatcher.span.<locals>.wrapper(func, instance, args, kwargs)
    270 self.span_enter(
    271     id_=id_, bound_args=bound_args, instance=instance, parent_id=parent_id
    272 )
    273 try:
--> 274     result = func(*args, **kwargs)
    275 except BaseException as e:
    276     self.event(SpanDropEvent(span_id=id_, err_str=str(e)))

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/base/base_query_engine.py:53, in BaseQueryEngine.query(self, str_or_query_bundle)
     51     if isinstance(str_or_query_bundle, str):
     52         str_or_query_bundle = QueryBundle(str_or_query_bundle)
---> 53     query_result = self._query(str_or_query_bundle)
     54 dispatch_event(QueryEndEvent(query=str_or_query_bundle, response=query_result))
     55 return query_result

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/query_engine/knowledge_graph_query_engine.py:199, in KnowledgeGraphQueryEngine._query(self, query_bundle)
    195 """Query the graph store."""
    196 with self.callback_manager.event(
    197     CBEventType.QUERY, payload={EventPayload.QUERY_STR: query_bundle.query_str}
    198 ) as query_event:
--> 199     nodes: List[NodeWithScore] = self._retrieve(query_bundle)
    201     response = self._response_synthesizer.synthesize(
    202         query=query_bundle,
    203         nodes=nodes,
    204     )
    206     if self._verbose:

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/query_engine/knowledge_graph_query_engine.py:154, in KnowledgeGraphQueryEngine._retrieve(self, query_bundle)
    152 def _retrieve(self, query_bundle: QueryBundle) -> List[NodeWithScore]:
    153     """Get nodes for response."""
--> 154     graph_store_query = self.generate_query(query_bundle.query_str)
    155     if self._verbose:
    156         print_text(f"Graph Store Query:\n{graph_store_query}\n", color="yellow")

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/query_engine/knowledge_graph_query_engine.py:132, in KnowledgeGraphQueryEngine.generate_query(self, query_str)
    129 """Generate a Graph Store Query from a query bundle."""
    130 # Get the query engine query string
--> 132 graph_store_query: str = self._llm.predict(
    133     self._graph_query_synthesis_prompt,
    134     query_str=query_str,
    135     schema=self._graph_schema,
    136 )
    138 return graph_store_query

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/instrumentation/dispatcher.py:274, in Dispatcher.span.<locals>.wrapper(func, instance, args, kwargs)
    270 self.span_enter(
    271     id_=id_, bound_args=bound_args, instance=instance, parent_id=parent_id
    272 )
    273 try:
--> 274     result = func(*args, **kwargs)
    275 except BaseException as e:
    276     self.event(SpanDropEvent(span_id=id_, err_str=str(e)))

File ~/Library/Python/3.9/lib/python/site-packages/llama_index/core/llms/llm.py:433, in LLM.predict(self, prompt, **prompt_args)
    411 """Predict for a given prompt.
    412
    413 Args:
    (...)
    429     ```
    430 """
    431 dispatch_event = dispatcher.get_dispatch_event()
--> 433 dispatch_event(LLMPredictStartEvent(template=prompt, template_args=prompt_args))
    434 self._log_template_data(prompt, **prompt_args)
    436 if self.metadata.is_chat_model:

File ~/Library/Python/3.9/lib/python/site-packages/pydantic/v1/main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 1 validation error for LLMPredictStartEvent
template
  none is not an allowed value (type=type_error.none.not_allowed)

The error seems to come from a Pydantic BaseModel, but I don't know how to fix it.
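Reading the traceback, the failure happens before the LLM is ever called: generate_query passes self._graph_query_synthesis_prompt into LLM.predict, and that attribute is evidently still None, so the LLMPredictStartEvent model rejects it with "none is not an allowed value". A minimal, library-free sketch of that failure mode (the class and function names below only mimic the llama-index internals, they are not the real ones):

```python
class LLMPredictStartEvent:
    """Stand-in for the Pydantic event model that rejects template=None."""
    def __init__(self, template, template_args=None):
        if template is None:
            # Pydantic v1 reports this as: none is not an allowed value
            raise ValueError("none is not an allowed value")
        self.template = template
        self.template_args = template_args or {}

def predict(graph_query_synthesis_prompt, query_str):
    """Mimics LLM.predict: the stored prompt is passed straight into the event,
    so an unresolved (None) prompt fails validation before any LLM call."""
    return LLMPredictStartEvent(
        template=graph_query_synthesis_prompt,
        template_args={"query_str": query_str},
    )

# With a real prompt the event is built; with None it raises:
event = predict("Generate a graph query for: {query_str}", "外观检测有什么作用")
print(event.template_args["query_str"])
```

If this reading is right, the prompt was never resolved for your graph store, and explicitly passing a graph_query_synthesis_prompt (a PromptTemplate for your graph store's query language, e.g. nGQL for NebulaGraph) to KnowledgeGraphQueryEngine, instead of relying on a default being inferred, may avoid the None.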

In my experiment, llama-index is at version 3.5.0 and pydantic is at version 2.7.1.

pydantic llama-index graph-query
1 Answer

Has this been solved? I am running into the same problem.
